==> Building on centiskorch
==> Checking for remote environment...
==> Syncing package to remote host...
sending incremental file list
created directory packages/dns-lexicon
./
.SRCINFO                   1,191 100%    0.00kB/s    0:00:00 (xfr#1, to-chk=3/5)
.nvchecker.toml               89 100%   86.91kB/s    0:00:00 (xfr#2, to-chk=2/5)
PKGBUILD                   1,751 100%    1.67MB/s    0:00:00 (xfr#3, to-chk=1/5)
dns-lexicon-3.18.0-2.log     607 100%  592.77kB/s    0:00:00 (xfr#4, to-chk=0/5)

sent 2,079 bytes  received 138 bytes  4,434.00 bytes/sec
total size is 3,229  speedup is 1.46
==> Running extra-riscv64-build -- -d /home/felix/packages/riscv64-pkg-cache:/var/cache/pacman/pkg -l root16 on remote host...
:: Synchronizing package databases...
 core downloading...
 extra downloading...
:: Starting full system upgrade...
 there is nothing to do
==> Building in chroot for [extra] (riscv64)...
==> Synchronizing chroot copy [/var/lib/archbuild/extra-riscv64/root] -> [root16]...done
==> Making package: dns-lexicon 3.18.0-2 (Sun Jan 5 14:02:08 2025)
==> Retrieving sources...
  -> Cloning lexicon git repo...
Cloning into bare repository '/home/felix/packages/dns-lexicon/lexicon'...
remote: Enumerating objects: 27321, done.
remote: Counting objects: 100% (307/307), done.
remote: Compressing objects: 100% (154/154), done.
remote: Total 27321 (delta 205), reused 177 (delta 148), pack-reused 27014 (from 2)
Receiving objects: 100% (27321/27321), 15.51 MiB | 6.77 MiB/s, done.
Resolving deltas: 100% (21304/21304), done.
==> Validating source files with sha512sums...
    lexicon ... Passed
==> Making package: dns-lexicon 3.18.0-2 (Sun Jan 5 14:02:29 2025)
==> Checking runtime dependencies...
==> Installing missing dependencies...
resolving dependencies...
looking for conflicting packages...
warning: dependency cycle detected:
warning: python-soupsieve will be installed before its python-beautifulsoup4 dependency

Package (20)                      New Version   Net Change  Download Size
extra/libyaml                     0.2.5-3         0.16 MiB
extra/mpdecimal                   4.0.0-2         0.29 MiB
core/python                       3.13.1-1      108.57 MiB
extra/python-cffi                 1.17.1-2        1.35 MiB
extra/python-charset-normalizer   3.4.1-1         0.44 MiB
extra/python-filelock             3.16.1-2.1      0.13 MiB
extra/python-idna                 3.10-2          0.88 MiB
extra/python-pycparser            2.22-3          1.69 MiB
extra/python-requests-file        2.1.0-2         0.02 MiB
extra/python-soupsieve            2.6-2           0.43 MiB
extra/python-urllib3              2.3.0-1         1.26 MiB
extra/python-zipp                 3.21.0-2        0.08 MiB
extra/python-beautifulsoup4       4.12.3-3        1.62 MiB
extra/python-cryptography         44.0.0-1        5.12 MiB
extra/python-dnspython            1:2.6.1-2       3.35 MiB       0.48 MiB
extra/python-importlib-metadata   7.2.1-4.1       0.21 MiB       0.05 MiB
extra/python-pyotp                2.9.0-3         0.10 MiB       0.03 MiB
extra/python-requests             2.32.3-4.1      0.60 MiB
extra/python-tldextract           5.1.3-2         0.44 MiB       0.11 MiB
extra/python-yaml                 6.0.2-2         0.91 MiB

Total Download Size:    0.67 MiB
Total Installed Size: 127.65 MiB

:: Proceed with installation? [Y/n]
:: Retrieving packages...
 python-dnspython-1:2.6.1-2-any downloading...
 python-tldextract-5.1.3-2-any downloading...
 python-importlib-metadata-7.2.1-4.1-any downloading...
 python-pyotp-2.9.0-3-any downloading...
checking keyring...
checking package integrity...
loading package files...
checking for file conflicts...
:: Processing package changes...
installing mpdecimal...
installing python...
Optional dependencies for python
    python-setuptools: for building Python packages using tooling that is usually bundled with Python
    python-pip: for installing Python packages using tooling that is usually bundled with Python
    python-pipx: for installing Python software not packaged on Arch Linux
    sqlite: for a default database integration [installed]
    xz: for lzma [installed]
    tk: for tkinter
installing python-soupsieve...
installing python-beautifulsoup4...
Optional dependencies for python-beautifulsoup4
    python-cchardet: alternative to autodetect character encodings
    python-chardet: to autodetect character encodings
    python-lxml: alternative HTML parser
    python-html5lib: alternative HTML parser
installing python-pycparser...
installing python-cffi...
Optional dependencies for python-cffi
    python-setuptools: "limited api" version checking in cffi.setuptools_ext
installing python-cryptography...
installing libyaml...
installing python-yaml...
installing python-charset-normalizer...
installing python-idna...
installing python-urllib3...
Optional dependencies for python-urllib3
    python-brotli: Brotli support
    python-brotlicffi: Brotli support
    python-h2: HTTP/2 support
    python-pysocks: SOCKS support
    python-zstandard: Zstandard support
installing python-requests...
Optional dependencies for python-requests
    python-chardet: alternative character encoding library
    python-pysocks: SOCKS proxy support
installing python-requests-file...
installing python-filelock...
installing python-tldextract...
installing python-zipp...
installing python-importlib-metadata...
installing python-pyotp...
installing python-dnspython...
Optional dependencies for python-dnspython
    python-cryptography: DNSSEC support [installed]
    python-requests-toolbelt: DoH support
    python-idna: support for updated IDNA 2008 [installed]
    python-curio: async support
    python-trio: async support
    python-sniffio: async support
:: Running post-transaction hooks...
(1/1) Arming ConditionNeedsUpdate...
==> Checking buildtime dependencies...
==> Installing missing dependencies...
resolving dependencies...
looking for conflicting packages...

Package (51)                    New Version      Net Change  Download Size
core/dnssec-anchors             20190629-4         0.00 MiB       0.00 MiB
extra/jemalloc                  1:5.3.0-5          6.08 MiB
core/libedit                    20240808_3.1-1     0.25 MiB
extra/libmaxminddb              1.11.0-1           0.04 MiB
extra/liburcu                   0.15.0-1           0.69 MiB
extra/libuv                     1.49.2-1           0.59 MiB
extra/libxslt                   1.1.42-2           0.77 MiB
extra/perl-error                0.17029-7          0.04 MiB
extra/perl-mailtools            2.22-1             0.10 MiB
extra/perl-timedate             2.33-7             0.08 MiB
extra/python-attrs              23.2.0-4           0.54 MiB
extra/python-botocore           1.35.88-1        100.78 MiB       6.90 MiB
extra/python-certifi            2024.12.14-1       0.02 MiB
extra/python-click              8.1.7-4            1.18 MiB
extra/python-dateutil           2.9.0-6.1          1.00 MiB
extra/python-fastjsonschema     2.21.1-1           0.27 MiB
extra/python-iniconfig          2.0.0-6            0.04 MiB
extra/python-isodate            0.7.2-1            0.17 MiB
extra/python-jmespath           1.0.1-5            0.21 MiB       0.04 MiB
extra/python-lark-parser        1.2.2-3            1.24 MiB
extra/python-lxml               5.3.0-2            4.52 MiB
extra/python-markdown-it-py     3.0.0-4.1          0.68 MiB       0.14 MiB
extra/python-mdurl              0.1.2-8            0.06 MiB
extra/python-multidict          6.0.5-4            0.16 MiB
extra/python-packaging          24.2-3             0.66 MiB
extra/python-platformdirs       4.3.6-2            0.24 MiB
extra/python-pluggy             1.5.0-3            0.20 MiB
extra/python-prettytable        3.11.0-2           0.35 MiB       0.06 MiB
extra/python-prompt_toolkit     3.0.48-2           4.39 MiB       0.70 MiB
extra/python-pygments           2.18.0-3          14.14 MiB
extra/python-pyproject-hooks    1.2.0-3            0.10 MiB
extra/python-pytz               2024.2-2           0.15 MiB
extra/python-requests-toolbelt  1.0.0-3            0.43 MiB
extra/python-rich               13.9.4-3           3.13 MiB       0.52 MiB
extra/python-s3transfer         0.10.4-1           0.91 MiB       0.14 MiB
extra/python-six                1.16.0-10          0.12 MiB
extra/python-typing_extensions  4.12.2-3           0.42 MiB
extra/python-wcwidth            0.2.13-3           0.57 MiB
extra/python-wrapt              1.16.0-4           0.25 MiB       0.05 MiB
extra/python-yarl               1.9.4-4            0.31 MiB
extra/bind                      9.20.4-2           6.61 MiB       2.22 MiB
extra/git                       2.47.1-1          27.20 MiB
extra/python-boto3              1.35.88-1          1.53 MiB       0.15 MiB
extra/python-build              1.2.2-3            0.20 MiB
extra/python-installer          0.7.0-10           0.17 MiB
extra/python-localzone          0.9.8-6            0.07 MiB       0.02 MiB
extra/python-poetry-core        1.9.1-1            1.28 MiB
extra/python-pytest             1:8.3.4-1          3.92 MiB
extra/python-softlayer          6.1.4-5            5.04 MiB       0.72 MiB
extra/python-vcrpy              6.0.1-4            0.43 MiB       0.09 MiB
extra/python-zeep               4.3.1-1            1.22 MiB       0.21 MiB

Total Download Size:   11.96 MiB
Total Installed Size: 193.53 MiB

:: Proceed with installation? [Y/n]
:: Retrieving packages...
 python-botocore-1.35.88-1-any downloading...
 bind-9.20.4-2-riscv64 downloading...
 python-softlayer-6.1.4-5-any downloading...
 python-prompt_toolkit-3.0.48-2-any downloading...
 python-rich-13.9.4-3-any downloading...
 python-zeep-4.3.1-1-any downloading...
 python-boto3-1.35.88-1-any downloading...
 python-markdown-it-py-3.0.0-4.1-any downloading...
 python-s3transfer-0.10.4-1-any downloading...
 python-vcrpy-6.0.1-4-any downloading...
 python-prettytable-3.11.0-2-any downloading...
 python-wrapt-1.16.0-4-riscv64 downloading...
 python-jmespath-1.0.1-5-any downloading...
 python-localzone-0.9.8-6-any downloading...
 dnssec-anchors-20190629-4-any downloading...
checking keyring...
checking package integrity...
loading package files...
checking for file conflicts...
:: Processing package changes...
installing perl-error...
installing perl-timedate...
installing perl-mailtools...
installing git...
Optional dependencies for git
    tk: gitk and git gui
    openssh: ssh transport and crypto
    perl-libwww: git svn
    perl-term-readkey: git svn and interactive.singlekey setting
    perl-io-socket-ssl: git send-email TLS support
    perl-authen-sasl: git send-email TLS support
    perl-mediawiki-api: git mediawiki support
    perl-datetime-format-iso8601: git mediawiki support
    perl-lwp-protocol-https: git mediawiki https support
    perl-cgi: gitweb (web interface) support
    python: git svn & git p4 [installed]
    subversion: git svn
    org.freedesktop.secrets: keyring credential helper
    libsecret: libsecret credential helper [installed]
installing python-packaging...
installing python-pyproject-hooks...
installing python-build...
Optional dependencies for python-build
    python-pip: to use as the Python package installer (default)
    python-uv: to use as the Python package installer
    python-virtualenv: to use virtualenv for build isolation
installing python-installer...
installing python-fastjsonschema...
installing python-typing_extensions...
installing python-lark-parser...
Optional dependencies for python-lark-parser
    python-atomicwrites: for atomic_cache
    python-regex: for regex support
installing python-poetry-core...
installing python-iniconfig...
installing python-pluggy...
installing python-pytest...
installing python-wrapt...
installing python-multidict...
installing python-yarl...
installing python-vcrpy...
installing python-certifi...
installing python-six...
installing python-dateutil...
installing python-jmespath...
installing python-botocore...
Optional dependencies for python-botocore
    python-awscrt
installing python-s3transfer...
Optional dependencies for python-s3transfer
    python-awscrt
installing python-boto3...
Optional dependencies for python-boto3
    python-awscrt: AWS CRT S3 transfers
installing python-localzone...
installing python-wcwidth...
installing python-prettytable...
installing python-click...
installing python-pygments...
installing python-prompt_toolkit...
installing python-mdurl...
installing python-markdown-it-py...
Optional dependencies for python-markdown-it-py
    python-mdit_py_plugins: core plugins
    python-linkify-it-py: linkify extension
installing python-rich...
installing python-softlayer...
installing python-attrs...
installing python-isodate...
installing libxslt...
Optional dependencies for libxslt
    python: Python bindings [installed]
installing python-lxml...
Optional dependencies for python-lxml
    python-beautifulsoup4: support for beautifulsoup parser to parse not well formed HTML [installed]
    python-cssselect: support for cssselect
    python-html5lib: support for html5lib parser
    python-lxml-docs: offline docs
    python-lxml-html-clean: enable htmlclean feature
installing python-platformdirs...
installing python-requests-toolbelt...
installing python-pytz...
installing python-zeep...
installing dnssec-anchors...
installing libedit...
installing libmaxminddb...
Optional dependencies for libmaxminddb
    geoip2-database: IP geolocation databases
installing libuv...
installing jemalloc...
Optional dependencies for jemalloc
    perl: for jeprof [installed]
installing liburcu...
installing bind...
:: Running post-transaction hooks...
(1/5) Creating system user accounts...
Creating group 'named' with GID 40.
Creating user 'named' (BIND DNS Server) with UID 40 and GID 40.
Creating group 'git' with GID 972.
Creating user 'git' (git daemon user) with UID 972 and GID 972.
(2/5) Reloading system manager configuration...
      Skipped: Current root is not booted.
(3/5) Creating temporary files...
(4/5) Arming ConditionNeedsUpdate...
(5/5) Warn about old perl modules
==> Retrieving sources...
==> WARNING: Skipping all source file integrity checks.
==> Extracting sources...
  -> Creating working copy of lexicon git repo...
Cloning into 'lexicon'...
done.
Updating files: 100% (2411/2411), done.
Switched to a new branch 'makepkg'
==> Starting build()...
* Getting build dependencies for wheel...
* Building wheel...
Successfully built dns_lexicon-3.18.0-py3-none-any.whl
==> Starting check()...
============================= test session starts ==============================
platform linux -- Python 3.13.1, pytest-8.3.4, pluggy-1.5.0
rootdir: /build/dns-lexicon/src/lexicon
configfile: pyproject.toml
collected 2422 items / 56 deselected / 2366 selected

tests/providers/test_aliyun.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [  1%]
tests/providers/test_aurora.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [  2%]
tests/providers/test_auto.py .F.FFFFFFFFFFFFFFFssFFFFFFFFF [  3%]
tests/providers/test_azure.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [  4%]
tests/providers/test_cloudflare.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [  5%]
tests/providers/test_cloudns.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [  7%]
tests/providers/test_cloudxns.py F.FFFFFFFFFFFFFFFssFFFFFFsFF [  8%]
tests/providers/test_conoha.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [  9%]
tests/providers/test_constellix.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 10%]
tests/providers/test_ddns.py ssssssssssssssssssssssssssss [ 11%]
tests/providers/test_digitalocean.py F.FFFFFFFFFFFFFFFssFFFFFFsFF [ 13%]
tests/providers/test_dinahosting.py F.FFFFFFFFFFFFFsFssFFFFFFFFF [ 14%]
tests/providers/test_directadmin.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 15%]
tests/providers/test_dnsimple.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 16%]
tests/providers/test_dnsmadeeasy.py F.FFFFFFFFFFFFFFFssFFFFFFsFF [ 17%]
tests/providers/test_dnspark.py F.FFFFFFFFFsssFFFFFsFF [ 18%]
tests/providers/test_dnspod.py F.FFFFFFFFFsssFFFFFsFF [ 19%]
tests/providers/test_dnsservices.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 20%]
tests/providers/test_dreamhost.py ...F.FFFFFFFFFFFFFsFssFFFFFFFFF [ 22%]
tests/providers/test_duckdns.py .ssFsFFFsFFFFFssssssFF.F.sFss [ 23%]
tests/providers/test_dynu.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 24%]
tests/providers/test_easydns.py F.FFFFFFFFFFssFFFFFsFF [ 25%]
tests/providers/test_easyname.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 26%]
tests/providers/test_euserv.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 27%]
tests/providers/test_exoscale.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 29%]
tests/providers/test_flexibleengine.py F.FFFFFFFFFFFsFFsssssFsFFFFF [ 30%]
tests/providers/test_gandi.py .................ss.........F.FFFFFFFFFFFF [ 31%]
FFFssFFFFFFFFF [ 32%]
tests/providers/test_gehirn.py F.FFFFFFFFFFssFFFFFFFF [ 33%]
tests/providers/test_glesys.py F.FFFFFFFFFFssFFFFFsFF [ 34%]
tests/providers/test_godaddy.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 35%]
tests/providers/test_googleclouddns.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 36%]
tests/providers/test_gransy.py FFFFFFFFFFFFFFFFFssFFFFFFFFF [ 37%]
tests/providers/test_gratisdns.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 39%]
tests/providers/test_henet.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 40%]
tests/providers/test_hetzner.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 41%]
tests/providers/test_hostingde.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 42%]
tests/providers/test_hover.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 43%]
tests/providers/test_infoblox.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 45%]
tests/providers/test_infomaniak.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 46%]
tests/providers/test_internetbs.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 47%]
tests/providers/test_inwx.py .................ss......... [ 48%]
tests/providers/test_joker.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 49%]
tests/providers/test_linode.py F.FFFFFFFFFFFFFsFssFFFFFFFFF [ 51%]
tests/providers/test_linode4.py F.FFFFFFFFFFFFFsFssFFFFFFFFF [ 52%]
tests/providers/test_localzone.py ss...............ss......... [ 53%]
tests/providers/test_luadns.py F.FsFFFFFFFFFFFFFssFFFFFFsFF [ 54%]
tests/providers/test_memset.py F.FFFFFFFFFFFFFFFssFFFFFFsFF [ 55%]
tests/providers/test_misaka.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 56%]
tests/providers/test_mythicbeasts.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 58%]
tests/providers/test_namecheap.py ssssssssssssssssssssssssssssssssssssss [ 59%]
ssssssssssssssssss [ 60%]
tests/providers/test_namecom.py F.FFFFFFFFFFFFFFFFFFFFFssFFFFFFFFFFFFFF. [ 62%]
 [ 62%]
tests/providers/test_namesilo.py F.FFFFFFFFFFFFFFFssFFFFFFsFF [ 63%]
tests/providers/test_netcup.py F.FFFFFFFFFFFFFsFssFFFFFFFFF [ 64%]
tests/providers/test_nfsn.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 65%]
tests/providers/test_njalla.py F.FFFFFFsFFFFFsFFssFFFFFFFFF [ 66%]
tests/providers/test_nsone.py F.FFFFFFFFFFFFFsFssFFFFFFsFF [ 68%]
tests/providers/test_onapp.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 69%]
tests/providers/test_online.py F.FFFFFFFFFFsFFFFssFFFFFFFFF [ 70%]
tests/providers/test_ovh.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 71%]
tests/providers/test_plesk.py F.FFFFFFFFFFFFFsFssFFFFFFFFF [ 72%]
tests/providers/test_pointhq.py F.FFFFFFFFFFFFFsFssFFFFFFsFF [ 74%]
tests/providers/test_porkbun.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 75%]
tests/providers/test_powerdns.py F.FFFFFFFFFFFFFFFssFFFFFFsFF [ 76%]
tests/providers/test_rackspace.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 77%]
tests/providers/test_rage4.py F.FFFFFFFFFFFFFFFssFFFFFssFs [ 78%]
tests/providers/test_rcodezero.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 79%]
tests/providers/test_route53.py FF..FFFFFFFFFFFFFFFssFFFFFFFFF [ 81%]
tests/providers/test_safedns.py F.FsFFFFFFFFFFFsFssFFFFFFFFF [ 82%]
tests/providers/test_sakuracloud.py F.FFFFFFFFsFssFFFFFFFF [ 83%]
tests/providers/test_softlayer.py F.FFFFFFFFFFssFFFFFFFF [ 84%]
tests/providers/test_timeweb.py F.ssFFFFFFFFFFFsFssFFFFFFFFF [ 85%]
tests/providers/test_transip.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 86%]
tests/providers/test_ultradns.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 87%]
tests/providers/test_valuedomain.py ssssssssssssssssssssssssssss [ 88%]
tests/providers/test_vercel.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 90%]
tests/providers/test_vultr.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 91%]
tests/providers/test_webgo.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 92%]
tests/providers/test_wedos.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 93%]
tests/providers/test_yandex.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 94%]
tests/providers/test_yandexcloud.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 96%]
tests/providers/test_zilore.py F.FFFFFFFFFFFFFFFssFFFFFFFFF [ 97%]
tests/providers/test_zonomi.py F.FFFFFFFFFFssFFFFFsFF [ 98%]
tests/test_client.py ............ [ 98%]
tests/test_config.py ......... [ 99%]
tests/test_library.py ............ [ 99%]
tests/test_output.py ..... [ 99%]
tests/test_parser.py ..... [100%]

=================================== FAILURES ===================================
________________ AliyunProviderTests.test_provider_authenticate ________________
self = > ???
tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aliyun.py:46: in authenticate ??? src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun ??? src/lexicon/_private/providers/aliyun.py:170: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...mp=2025-01-05T19%3A03%3A16%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=S4h3tEl9NrSrwLivfNQ84%2BC%2FXc0%3D' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries
        response._connection = response_conn  # type: ignore[attr-defined]
        response._pool = self  # type: ignore[attr-defined]
        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AliyunProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
self = > ???
tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/aliyun.py:46: in authenticate ???
src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun ???
src/lexicon/_private/providers/aliyun.py:170: in _request ???
url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...estamp=2025-01-05T19%3A03%3A16%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=iJ9C9WM835Bsp5O79JDbN4eNZVU%3D'
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AliyunProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
self = > ???
tests/providers/integration_tests.py:133: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/aliyun.py:46: in authenticate ???
src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun ???
src/lexicon/_private/providers/aliyun.py:170: in _request ???
url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...tamp=2025-01-05T19%3A03%3A16%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=zD1phRw3XSE%2BEffLflBffKqeAiI%3D'
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AliyunProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
self = > ???
tests/providers/integration_tests.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/aliyun.py:46: in authenticate ???
src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun ???
src/lexicon/_private/providers/aliyun.py:170: in _request ???
url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...estamp=2025-01-05T19%3A03%3A17%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=szCG64Nlnzm7L3VmzP3LdJPVYNo%3D'
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AliyunProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = > ??? 
tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aliyun.py:46: in authenticate ??? src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun ??? src/lexicon/_private/providers/aliyun.py:170: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...estamp=2025-01-05T19%3A03%3A17%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=eUlIU1hIjDcUYbeTbPTdFKMKdJI%3D' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AliyunProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aliyun.py:46: in authenticate ??? src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun ??? src/lexicon/_private/providers/aliyun.py:170: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...tamp=2025-01-05T19%3A03%3A17%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=xg%2BOvg6rVfEqd3ImFYuJb7ATRSs%3D' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AliyunProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _ self = > ??? tests/providers/integration_tests.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aliyun.py:46: in authenticate ??? src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun ??? src/lexicon/_private/providers/aliyun.py:170: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...mp=2025-01-05T19%3A03%3A17%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=Bs0vF%2BoHw8zt%2FoCc7RYOvSpV5dM%3D' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AliyunProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ self = > ??? tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aliyun.py:46: in authenticate ??? src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun ??? src/lexicon/_private/providers/aliyun.py:170: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...estamp=2025-01-05T19%3A03%3A18%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=AQL09uAYSGnQ6MmiDTWSywJQ2TI%3D' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AliyunProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? 
tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aliyun.py:46: in authenticate ??? src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun ??? src/lexicon/_private/providers/aliyun.py:170: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...tamp=2025-01-05T19%3A03%3A18%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=a4OfaGmWIJkfJ8NsuXy%2FaZ2lQY0%3D' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AliyunProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aliyun.py:46: in authenticate ??? src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun ??? src/lexicon/_private/providers/aliyun.py:170: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...tamp=2025-01-05T19%3A03%3A18%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=rQUT7%2BD1VqHKv2b4EGetrwlYdj0%3D' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AliyunProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aliyun.py:46: in authenticate ??? src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun ??? src/lexicon/_private/providers/aliyun.py:170: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...estamp=2025-01-05T19%3A03%3A18%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=NQYJg0e2LdSs4YC0jxc60z3hSDE%3D' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AliyunProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aliyun.py:46: in authenticate ??? src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun ??? src/lexicon/_private/providers/aliyun.py:170: in _request ??? 
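The failure mode is the same for every Aliyun test in this run: urllib3 2.3.0 added a version_string property to its HTTPResponse and reads it on the debug-log line quoted above, but the VCRHTTPResponse stub that replays the recorded cassettes (the class name points at vcrpy / python-vcr) predates that attribute, so each replayed request dies inside urllib3's logging before the provider code ever sees a response. A minimal compatibility shim, assuming vcrpy's vcr.stubs.VCRHTTPResponse is the stub in play and that a fixed protocol string is acceptable, could look like the sketch below; it is a hypothetical workaround, not something taken from this build.

# Hypothetical shim (e.g. in a conftest.py): give vcrpy's stubbed responses
# the version_string attribute that urllib3 >= 2.3 reads when logging.
# vcr.stubs.VCRHTTPResponse and the constant "HTTP/1.1" are assumptions here.
import vcr.stubs

if not hasattr(vcr.stubs.VCRHTTPResponse, "version_string"):
    # urllib3 only interpolates the value into a debug log line, so a class
    # attribute holding a fixed protocol string is enough to unblock the tests.
    vcr.stubs.VCRHTTPResponse.version_string = "HTTP/1.1"

Holding the chroot's urllib3 below 2.3, or picking up a python-vcr release that already defines version_string, would make such a shim unnecessary.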
[... requests and urllib3 frames, locals and the quoted _make_request source are identical to the failure shown in full above; this request differs only in Timestamp (19:03:18) and Signature=KNzyhzpNeE3q62zxhh15fycsnC4%3D ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AliyunProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _
tests/providers/integration_tests.py:541 [... identical frames, locals and source; Timestamp 19:03:19, Signature=NhL8myHumP4VmUfmbIBkpVRvMU0%3D ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AliyunProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _
tests/providers/integration_tests.py:521 [... identical frames, locals and source; Timestamp 19:03:19, Signature=RRdGVREKGmS81822egJXylosGis%3D ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AliyunProviderTests.test_provider_when_calling_list_records_after_setting_ttl _
tests/providers/integration_tests.py:211 [... identical frames, locals and source; Timestamp 19:03:19, Signature=bt547ore9WU61qfJ6xauhddenYo%3D ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AliyunProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _
tests/providers/integration_tests.py:507 [... identical frames, locals and source; Timestamp 19:03:19, Signature=z25NYXh287YW5%2F43AMQgOoW290M%3D ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AliyunProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _
tests/providers/integration_tests.py:199 [... identical frames, locals and source; Timestamp 19:03:20, Signature=4LjLz%2F8YIRK5PJ%2BXrtKOvQm9TdU%3D ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AliyunProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _
tests/providers/integration_tests.py:185 [... identical frames, locals and source; Timestamp 19:03:20, Signature=bzDFsCFxZl5yWhlRL0nqCBBQwhE%3D ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AliyunProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _
tests/providers/integration_tests.py:501 [... identical frames, locals and source; Timestamp 19:03:20, Signature=EEVT0thscl1R1Z3d%2FGqKG%2Bd6KYI%3D ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AliyunProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _
self = > ??? tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aliyun.py:46: in authenticate ??? src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun ??? src/lexicon/_private/providers/aliyun.py:170: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...tamp=2025-01-05T19%3A03%3A20%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=LWY9Dp3v%2FRURS49DxNUC3qiHdlk%3D' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AliyunProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? 
tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aliyun.py:46: in authenticate ??? src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun ??? src/lexicon/_private/providers/aliyun.py:170: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...tamp=2025-01-05T19%3A03%3A20%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=HkP7MjhyXHYlTMmtTvv8%2F4rLEqY%3D' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AliyunProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aliyun.py:46: in authenticate ??? src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun ??? src/lexicon/_private/providers/aliyun.py:170: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...estamp=2025-01-05T19%3A03%3A21%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=uaVf1ZL6UClSQYkKgiThHFs6Tsk%3D' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AliyunProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aliyun.py:46: in authenticate ??? src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun ??? src/lexicon/_private/providers/aliyun.py:170: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...=2025-01-05T19%3A03%3A21%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=ag%2BBU%2FHCyK%2Fy8WyHwu3wJf2YOs4%3D' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. 
If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AliyunProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aliyun.py:46: in authenticate ??? src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun ??? src/lexicon/_private/providers/aliyun.py:170: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...estamp=2025-01-05T19%3A03%3A21%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=UMWw6QeNSsLsN2eLFPD6io5C4co%3D' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AliyunProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? 
tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aliyun.py:46: in authenticate ??? src/lexicon/_private/providers/aliyun.py:192: in _request_aliyun ??? src/lexicon/_private/providers/aliyun.py:170: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/?Action=DescribeDomainInfo&DomainName=mean.space&Format=json&Version=2015-01-09&SignatureMethod=HMAC-SHA1&SignatureV...mp=2025-01-05T19%3A03%3A21%2B00%3A00Z&AccessKeyId=placeholder_auth_key_id&Signature=7wgG7f%2Bzql05sHnUtkB8K5s%2Fh1M%3D' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError ________________ AuroraProviderTests.test_provider_authenticate ________________ self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aurora.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/aurora.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...RpZmVuQU1KS1RDdjQ5QUx5bnEvR1UrVWRTZE9hVU1UbkFVTXRuVzRCeE09', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AuroraProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aurora.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/aurora.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...pyMEpOdFdiajczU2VKSStYckpIbktkZzR3UEtEMEtTRGxuTEh2K1p6cDA9', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
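These failures all come from the same place: urllib3's connection pool now interpolates response.version_string into its debug log line once a request completes, while the VCRHTTPResponse object that vcrpy substitutes during cassette playback never defines that attribute, so every replayed request dies at connectionpool.py:551 before the Aurora provider ever sees a response. The remaining Aurora provider tests listed below fail the same way. A minimal local shim is sketched next; it assumes vcrpy's vcr.stubs.VCRHTTPResponse class and a static "HTTP/1.1" placeholder are acceptable, and it is only a way to confirm the diagnosis, not the packaging fix (updating python-vcrpy to a urllib3-2.3-aware release, or temporarily pinning urllib3, would be the cleaner route).

# conftest.py (sketch): give vcrpy's playback response the attribute that
# urllib3's connection-pool logging expects. Hypothetical local workaround,
# not the upstream fix.
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):
    # Assumption: a fixed "HTTP/1.1" string is sufficient here, since the
    # value is only used inside urllib3's debug log message.
    VCRHTTPResponse.version_string = "HTTP/1.1"

Dropping that into the test suite's conftest.py should make the AttributeError disappear during playback; if it does, the breakage is purely a vcrpy/urllib3 compatibility issue rather than anything in the dns-lexicon providers themselves.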
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...pyMEpOdFdiajczU2VKSStYckpIbktkZzR3UEtEMEtTRGxuTEh2K1p6cDA9', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AuroraProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = > ??? 
tests/providers/integration_tests.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aurora.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/aurora.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...pyMEpOdFdiajczU2VKSStYckpIbktkZzR3UEtEMEtTRGxuTEh2K1p6cDA9', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AuroraProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = > ??? tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aurora.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/aurora.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...pyMEpOdFdiajczU2VKSStYckpIbktkZzR3UEtEMEtTRGxuTEh2K1p6cDA9', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AuroraProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aurora.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/aurora.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...pyMEpOdFdiajczU2VKSStYckpIbktkZzR3UEtEMEtTRGxuTEh2K1p6cDA9', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AuroraProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _ self = > ??? tests/providers/integration_tests.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aurora.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/aurora.py:154: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...hSa2pXbjZCMDhZa1ZKeWNJMmpnTmppNlVjTXplcnhGaVMzSG1GU0lMRVE9', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AuroraProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ self = > ??? 
tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aurora.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/aurora.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...hSa2pXbjZCMDhZa1ZKeWNJMmpnTmppNlVjTXplcnhGaVMzSG1GU0lMRVE9', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AuroraProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aurora.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/aurora.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...hSa2pXbjZCMDhZa1ZKeWNJMmpnTmppNlVjTXplcnhGaVMzSG1GU0lMRVE9', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AuroraProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aurora.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/aurora.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...hSa2pXbjZCMDhZa1ZKeWNJMmpnTmppNlVjTXplcnhGaVMzSG1GU0lMRVE9', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AuroraProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aurora.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/aurora.py:154: in _request ??? 
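Every one of the Aurora provider failures above dies in the same place: urllib3 logs response.version_string from _make_request() (connectionpool.py:551), an attribute added in urllib3 2.3, but the cassette-backed VCRHTTPResponse that vcrpy substitutes for the real HTTP response does not define it, so the request aborts before the provider test body ever runs. A minimal stopgap, assuming the suite stubs responses with vcr.stubs.VCRHTTPResponse and that a hard-coded protocol string is acceptable for a debug log line, would be a conftest.py shim along these lines (hypothetical, not the upstream fix):

    # conftest.py: hypothetical stopgap, not the upstream vcrpy fix
    from vcr.stubs import VCRHTTPResponse

    # urllib3 >= 2.3 reads response.version_string when logging each request;
    # the cassette-backed response class predates that attribute, so give the
    # class a conservative default that only feeds the debug log line.
    if not hasattr(VCRHTTPResponse, "version_string"):
        VCRHTTPResponse.version_string = "HTTP/1.1"  # assumed protocol string

The cleaner fixes are a vcrpy release that knows about the attribute, or pinning urllib3 below 2.3 in the build environment; the shim only unblocks the test run.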
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...VRdmFZT2d5eU9GajdhNHQwSExzUmQ5UnRDNmZBdDlxUUhHcXFZcjViU3c9', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AuroraProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? 
tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aurora.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/aurora.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...VRdmFZT2d5eU9GajdhNHQwSExzUmQ5UnRDNmZBdDlxUUhHcXFZcjViU3c9', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AuroraProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aurora.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/aurora.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...VRdmFZT2d5eU9GajdhNHQwSExzUmQ5UnRDNmZBdDlxUUhHcXFZcjViU3c9', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AuroraProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aurora.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/aurora.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...VRdmFZT2d5eU9GajdhNHQwSExzUmQ5UnRDNmZBdDlxUUhHcXFZcjViU3c9', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AuroraProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ self = > ??? tests/providers/integration_tests.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aurora.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/aurora.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...VRdmFZT2d5eU9GajdhNHQwSExzUmQ5UnRDNmZBdDlxUUhHcXFZcjViU3c9', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AuroraProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? tests/providers/integration_tests.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aurora.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/aurora.py:154: in _request ??? 
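If patching the stub is not acceptable in the packaging chroot, another option is to skip the cassette-backed provider suites at collection time and keep the rest of the test run. A sketch using standard pytest hooks; the name-based filter is an assumption based on the <Provider>ProviderTests classes visible in this log:

    # conftest.py: alternative sketch, skip the recorded provider suites outright
    import pytest

    def pytest_collection_modifyitems(config, items):
        marker = pytest.mark.skip(
            reason="vcrpy stub lacks version_string expected by urllib3 >= 2.3"
        )
        for item in items:
            cls = getattr(item, "cls", None)
            # Heuristic: the failing suites in this log are all named <Provider>ProviderTests.
            if cls is not None and cls.__name__.endswith("ProviderTests"):
                item.add_marker(marker)

Skipping trades away provider coverage, so it only makes sense as a temporary packaging workaround while the vcrpy/urllib3 mismatch is unresolved.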
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...JrN3hVMEt6VlhnaEs5bFRqZE5aSVVBc1VMdzQreEtvZWlWQ3F1SVhqdnM9', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AuroraProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? 
tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aurora.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/aurora.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...JrN3hVMEt6VlhnaEs5bFRqZE5aSVVBc1VMdzQreEtvZWlWQ3F1SVhqdnM9', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AuroraProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/aurora.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/aurora.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...JrN3hVMEt6VlhnaEs5bFRqZE5aSVVBc1VMdzQreEtvZWlWQ3F1SVhqdnM9', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
_ AuroraProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _

self = > ???
tests/providers/integration_tests.py:501: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/aurora.py:40: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/aurora.py:154: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
self = conn = method = 'GET', url = '/zones', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...JrN3hVMEt6VlhnaEs5bFRqZE5aSVVBc1VMdzQreEtvZWlWQ3F1SVhqdnM9', 'Content-Type': 'application/json', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn = preload_content = False, decode_content = False, enforce_content_length = True
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AuroraProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _

self = > ???
tests/providers/integration_tests.py:173: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/aurora.py:40: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/aurora.py:154: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
self = conn = method = 'GET', url = '/zones', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...dOQ0d3U3o1TmdLVENIaVVxci8yQUVXcWxlNGZZSE15WnhqWXlQYlNRQjA9', 'Content-Type': 'application/json', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn = preload_content = False, decode_content = False, enforce_content_length = True
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AuroraProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _

self = > ???
tests/providers/integration_tests.py:166: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/aurora.py:40: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/aurora.py:154: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
self = conn = method = 'GET', url = '/zones', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...dOQ0d3U3o1TmdLVENIaVVxci8yQUVXcWxlNGZZSE15WnhqWXlQYlNRQjA9', 'Content-Type': 'application/json', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn = preload_content = False, decode_content = False, enforce_content_length = True
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
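The retries value in each set of locals, Retry(total=0, connect=None, read=False, redirect=None, status=None), is the single-attempt default that requests hands to urlopen() when no retry policy has been configured; the _make_request docstring reproduced above describes the same parameter. A short sketch of setting it explicitly through the public requests/urllib3 API follows (the host and the retry values are illustrative only):

    # Sketch: explicit retry policy via requests' HTTPAdapter and urllib3's Retry.
    import requests
    from requests.adapters import HTTPAdapter
    from urllib3.util.retry import Retry

    session = requests.Session()
    # Retry(0, read=False) would reproduce the no-retry default shown in the
    # locals above; this instead retries transient 5xx answers with backoff.
    retry = Retry(total=3, backoff_factor=0.5, status_forcelist=[502, 503, 504])
    session.mount("https://", HTTPAdapter(max_retries=retry))
    # response = session.get("https://api.example.com/zones", timeout=10)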
_ AuroraProviderTests.test_provider_when_calling_update_record_should_modify_record _

self = > ???
tests/providers/integration_tests.py:240: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/aurora.py:40: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/aurora.py:154: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
self = conn = method = 'GET', url = '/zones', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...dOQ0d3U3o1TmdLVENIaVVxci8yQUVXcWxlNGZZSE15WnhqWXlQYlNRQjA9', 'Content-Type': 'application/json', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn = preload_content = False, decode_content = False, enforce_content_length = True
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AuroraProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _

self = > ???
tests/providers/integration_tests.py:251: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/aurora.py:40: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/aurora.py:154: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
self = conn = method = 'GET', url = '/zones', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...dOQ0d3U3o1TmdLVENIaVVxci8yQUVXcWxlNGZZSE15WnhqWXlQYlNRQjA9', 'Content-Type': 'application/json', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn = preload_content = False, decode_content = False, enforce_content_length = True
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AuroraProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _

self = > ???
tests/providers/integration_tests.py:275: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/aurora.py:40: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/aurora.py:154: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
self = conn = method = 'GET', url = '/zones', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...lFMHJzQmQzYk5PKzNZUGZibFh4NUR6S0k3V2E3MmJTcWFUYzIvREFTTm89', 'Content-Type': 'application/json', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn = preload_content = False, decode_content = False, enforce_content_length = True
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AuroraProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _

self = > ???
tests/providers/integration_tests.py:259: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/aurora.py:40: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/aurora.py:154: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
self = conn = method = 'GET', url = '/zones', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...lFMHJzQmQzYk5PKzNZUGZibFh4NUR6S0k3V2E3MmJTcWFUYzIvREFTTm89', 'Content-Type': 'application/json', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn = preload_content = False, decode_content = False, enforce_content_length = True
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_________________ AutoProviderTests.test_provider_authenticate _________________

self = > ???
tests/providers/integration_tests.py:106: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/auto.py:242: in authenticate
    ???
src/lexicon/_private/providers/ovh.py:90: in authenticate
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
self = conn = method = 'GET', url = '/1.0/auth/time', body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn = preload_content = False, decode_content = False, enforce_content_length = True
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
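The locals also show timeout = Timeout(connect=None, read=None, total=None), meaning no client-side limit at all; as the docstring reproduced earlier notes, either a float or an urllib3.util.Timeout instance can override this per request. A brief sketch using urllib3's public API (the URL and durations are placeholders):

    # Sketch: per-request connect/read timeouts with urllib3's Timeout object.
    import urllib3
    from urllib3.util import Timeout

    http = urllib3.PoolManager()
    # Give up on the TCP connect after 3 s and on any single read after 10 s,
    # matching the semantics described in the _make_request docstring above.
    timeout = Timeout(connect=3.0, read=10.0)
    # response = http.request("GET", "https://api.example.com/1.0/auth/time", timeout=timeout)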
_ AutoProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

self = > ???
tests/providers/integration_tests.py:126: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/auto.py:242: in authenticate
    ???
src/lexicon/_private/providers/ovh.py:90: in authenticate
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
self = conn = method = 'GET', url = '/1.0/auth/time', body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn = preload_content = False, decode_content = False, enforce_content_length = True
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:133: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = > ??? 
tests/providers/integration_tests.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = > ??? tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. 
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _ self = > ??? tests/providers/integration_tests.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ self = > ??? tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? 
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. 
Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/auto.py:242: in authenticate ??? 
src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? 
tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. 
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_list_records_after_setting_ttl __ self = > ??? tests/providers/integration_tests.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? tests/providers/integration_tests.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AutoProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/auto.py:242: in authenticate ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AutoProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _
tests/providers/integration_tests.py:185:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/auto.py:242: in authenticate
src/lexicon/_private/providers/ovh.py:90: in authenticate
[requests/urllib3 call chain (GET /1.0/auth/time) and _make_request listing identical to the failure above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AutoProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _
tests/providers/integration_tests.py:501:
[traceback identical to the failure above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AutoProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _
tests/providers/integration_tests.py:173:
[traceback identical to the failure above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AutoProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _
tests/providers/integration_tests.py:166:
[traceback identical to the failure above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AutoProviderTests.test_provider_when_calling_update_record_should_modify_record _
tests/providers/integration_tests.py:240:
[traceback identical to the failure above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AutoProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _
tests/providers/integration_tests.py:251:
[traceback identical to the failure above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AutoProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _
tests/providers/integration_tests.py:275:
[traceback identical to the failure above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AutoProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _
tests/providers/integration_tests.py:259:
[traceback identical to the failure above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
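Every AttributeError in this test run traces to the same incompatibility: urllib3's connection pool logging reads response.version_string from the completed response, while the playback stub that vcrpy substitutes for real responses (VCRHTTPResponse) predates that attribute. Below is a minimal sketch of a conftest-level workaround, assuming the stub class is importable as vcr.stubs.VCRHTTPResponse; the helper name _version_string and the HTTP/1.1 fallback are illustrative only, and upgrading to a vcrpy release that provides the attribute itself is the cleaner fix where available.

# conftest.py -- hypothetical compatibility shim, not part of dns-lexicon or vcrpy
# urllib3's connectionpool logging accesses response.version_string; give the
# vcrpy playback stub a best-effort property so replayed responses satisfy it.
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):

    def _version_string(self):
        # http.client-style responses expose .version as 10 or 11; fall back to
        # HTTP/1.1 when the recorded cassette carries no version information.
        return {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(getattr(self, "version", 11), "HTTP/1.1")

    VCRHTTPResponse.version_string = property(_version_string)

With such a shim loaded by the test suite, the log.debug call at connectionpool.py:551 should find version_string on replayed responses and the AttributeError above goes away; cassette recording behaviour is unchanged.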
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
____________________ AzureTests.test_provider_authenticate _____________________
tests/providers/integration_tests.py:106:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/azure.py:266: in authenticate
/usr/lib/python3.13/site-packages/requests/api.py:115: in post
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
_ AzureTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
tests/providers/integration_tests.py:126:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/azure.py:266: in authenticate
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AzureTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
tests/providers/integration_tests.py:133:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/azure.py:266: in authenticate
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AzureTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
tests/providers/integration_tests.py:156:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/azure.py:266: in authenticate
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
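Every AzureTests failure above is the same crash: the urllib3 connection pool logs response.version_string after receiving a response (visible in the quoted _make_request source), and the VCRHTTPResponse stub that vcrpy substitutes during cassette playback does not define that attribute, so provider construction never gets past authenticate(). A minimal test-side shim, sketched under the assumption that the stub is importable as vcr.stubs.VCRHTTPResponse and carries an http.client-style integer .version; on a vcrpy that already ships the attribute, the guard makes this a no-op:

    # conftest.py sketch: backfill the attribute urllib3 expects on vcrpy's
    # playback stub. vcr.stubs.VCRHTTPResponse and the integer .version field
    # are assumptions about the vcrpy in this chroot, not verified here.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):

        @property
        def version_string(self):
            # http.client-style responses report 10/11; default to HTTP/1.1
            # when the recorded interaction carries no version at all.
            return {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(getattr(self, "version", 11), "HTTP/1.1")

        VCRHTTPResponse.version_string = version_string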
_ AzureTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _
tests/providers/integration_tests.py:147:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/azure.py:266: in authenticate
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AzureTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _
tests/providers/integration_tests.py:140:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/azure.py:266: in authenticate
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AzureTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _
tests/providers/integration_tests.py:489:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/azure.py:266: in authenticate
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
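The remaining cases below fail in exactly the same way before any provider-specific request is replayed, so the whole Azure suite collapses during provider construction. If patching the stub is not wanted, the check() run could instead deselect the affected suites; a sketch, where matching "AzureTests" in the node id is an assumption drawn from the test names in this log:

    # conftest.py sketch: skip the cassette-backed Azure suites when the vcrpy
    # stub predates urllib3's version_string; purely a packaging-side workaround.
    import pytest
    from vcr.stubs import VCRHTTPResponse

    def pytest_collection_modifyitems(config, items):
        if hasattr(VCRHTTPResponse, "version_string"):
            return  # stub already compatible, run everything
        skip = pytest.mark.skip(reason="vcrpy stub lacks version_string required by urllib3 >= 2.3")
        for item in items:
            if "AzureTests" in item.nodeid:
                item.add_marker(skip)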
_ AzureTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _
tests/providers/integration_tests.py:475:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/azure.py:266: in authenticate
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AzureTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _
tests/providers/integration_tests.py:303:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/azure.py:266: in authenticate
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AzureTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ???
tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/azure.py:266: in authenticate ???
/usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs)
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = conn = method = 'POST', url = '/placeholder_auth_tenant_id/oauth2/token'
body = 'grant_type=client_credentials&client_id=placeholder_auth_client_id&client_secret=placeholder_auth_client_secret&resource=https%3A%2F%2Fmanagement.azure.com'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '155', 'Content-Type': 'application/x-www-form-urlencoded'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn = preload_content = False, decode_content = False, enforce_content_length = True
def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on.
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AzureTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/azure.py:266: in authenticate ??? 
/usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/placeholder_auth_tenant_id/oauth2/token' body = 'grant_type=client_credentials&client_id=placeholder_auth_client_id&client_secret=placeholder_auth_client_secret&resource=https%3A%2F%2Fmanagement.azure.com' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '155', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. 
:param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AzureTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/azure.py:266: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/placeholder_auth_tenant_id/oauth2/token' body = 'grant_type=client_credentials&client_id=placeholder_auth_client_id&client_secret=placeholder_auth_client_secret&resource=https%3A%2F%2Fmanagement.azure.com' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '155', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AzureTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/azure.py:266: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/placeholder_auth_tenant_id/oauth2/token' body = 'grant_type=client_credentials&client_id=placeholder_auth_client_id&client_secret=placeholder_auth_client_secret&resource=https%3A%2F%2Fmanagement.azure.com' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '155', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AzureTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/azure.py:266: in authenticate ??? 
/usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/placeholder_auth_tenant_id/oauth2/token' body = 'grant_type=client_credentials&client_id=placeholder_auth_client_id&client_secret=placeholder_auth_client_secret&resource=https%3A%2F%2Fmanagement.azure.com' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '155', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. 
:param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _____ AzureTests.test_provider_when_calling_list_records_after_setting_ttl _____ self = > ??? tests/providers/integration_tests.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/azure.py:266: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/placeholder_auth_tenant_id/oauth2/token' body = 'grant_type=client_credentials&client_id=placeholder_auth_client_id&client_secret=placeholder_auth_client_secret&resource=https%3A%2F%2Fmanagement.azure.com' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '155', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AzureTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? tests/providers/integration_tests.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/azure.py:266: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/placeholder_auth_tenant_id/oauth2/token' body = 'grant_type=client_credentials&client_id=placeholder_auth_client_id&client_secret=placeholder_auth_client_secret&resource=https%3A%2F%2Fmanagement.azure.com' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '155', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ AzureTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/azure.py:266: in authenticate ??? 
_ AzureTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _
self = > ??? tests/providers/integration_tests.py:185: tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/azure.py:266: in authenticate ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AzureTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _
self = > ??? tests/providers/integration_tests.py:501: tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/azure.py:266: in authenticate ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AzureTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _
self = > ??? tests/providers/integration_tests.py:173: tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/azure.py:266: in authenticate ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AzureTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _
self = > ??? tests/providers/integration_tests.py:166: tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/azure.py:266: in authenticate ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
___ AzureTests.test_provider_when_calling_update_record_should_modify_record ___
self = > ??? tests/providers/integration_tests.py:240: tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/azure.py:266: in authenticate ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AzureTests.test_provider_when_calling_update_record_should_modify_record_name_specified _
self = > ??? tests/providers/integration_tests.py:251: tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/azure.py:266: in authenticate ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AzureTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _
self = > ??? tests/providers/integration_tests.py:275: tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/azure.py:266: in authenticate ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ AzureTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/azure.py:266: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/placeholder_auth_tenant_id/oauth2/token' body = 'grant_type=client_credentials&client_id=placeholder_auth_client_id&client_secret=placeholder_auth_client_secret&resource=https%3A%2F%2Fmanagement.azure.com' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '155', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on.
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError ______________ CloudflareProviderTests.test_provider_authenticate ______________ self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudflare.py:56: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudflare.py:210: in _request ??? 
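Every failure in this run is the same error: urllib3's connection pool now logs response.version_string for each request (connectionpool.py:551 in the traces above and below), but VCRHTTPResponse, the stub object that vcrpy substitutes for a real response while replaying a cassette, does not define that attribute. The attribute appears to have been added in urllib3 2.3, so these cassette-based provider tests start failing as soon as the build chroot picks up that release; nothing here points at the riscv64 port itself.

A minimal sketch of a test-suite workaround, assuming that analysis is correct. This is not the fix that was actually applied to the package or to vcrpy upstream; the conftest.py placement and the constant value are assumptions for illustration only:

    # conftest.py
    from vcr.stubs import VCRHTTPResponse

    # urllib3 >= 2.3 reads response.version_string when it logs each request.
    # Give vcrpy's replay stub that attribute so cassette playback no longer
    # crashes; the value only feeds a debug log line, so a constant is enough.
    if not hasattr(VCRHTTPResponse, "version_string"):
        VCRHTTPResponse.version_string = "HTTP/1.1"
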
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/client/v4/zones?name=pacalis.net', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudflareProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? 
tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudflare.py:56: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudflare.py:210: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/client/v4/zones?name=pacalis.net', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudflareProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:133: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudflare.py:56: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudflare.py:210: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/client/v4/zones?name=pacalis.net', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudflareProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = > ??? tests/providers/integration_tests.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudflare.py:56: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudflare.py:210: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/client/v4/zones?name=pacalis.net', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudflareProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = > ??? tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudflare.py:56: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudflare.py:210: in _request ??? 
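Note that every Cloudflare case fails at the same point: the very first request of authenticate(), the zone lookup GET /client/v4/zones?name=pacalis.net issued through cloudflare.py:210 with the placeholder bearer token from the cassette. Playback never gets past the first recorded request, which is consistent with the missing version_string attribute described above rather than with anything provider- or architecture-specific.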
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/client/v4/zones?name=pacalis.net', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudflareProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? 
tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudflare.py:56: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudflare.py:210: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/client/v4/zones?name=pacalis.net', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries
        response._connection = response_conn  # type: ignore[attr-defined]
        response._pool = self  # type: ignore[attr-defined]

        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudflareProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _
tests/providers/integration_tests.py:489: ???
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/cloudflare.py:56: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/cloudflare.py:210: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudflareProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _
tests/providers/integration_tests.py:475: ???
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudflareProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _
tests/providers/integration_tests.py:303: ???
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudflareProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _
tests/providers/integration_tests.py:327: ???
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudflareProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _
tests/providers/integration_tests.py:313: ???
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
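Every one of these Cloudflare failures is the same incompatibility: urllib3's connection pool logs response.version_string while handling the request (connectionpool.py:551 above), but the replay stub that python-vcr substitutes for these cassette-driven tests, VCRHTTPResponse, does not define that attribute. A quick way to confirm the mismatch from inside the build chroot (a sketch, assuming the chroot's python-urllib3 and python-vcr are the same ones the test suite imports):

    # check_vcr_urllib3.py -- diagnostic sketch, not part of the build
    import urllib3
    from vcr.stubs import VCRHTTPResponse  # the stub named in the traceback

    print("urllib3:", urllib3.__version__)
    # False here is exactly what produces the AttributeError above
    print("stub has version_string:", hasattr(VCRHTTPResponse, "version_string"))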
_ CloudflareProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _
tests/providers/integration_tests.py:294: ???
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudflareProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _
tests/providers/integration_tests.py:541: ???
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudflareProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _
tests/providers/integration_tests.py:521: ???
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
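This points at the test environment rather than at dns-lexicon itself: the stub only needs to expose the attribute that newer urllib3 reads when formatting its debug log line. Updating python-vcr to a release that supports urllib3 2.3, or running the suite against urllib3 older than 2.3, should clear the whole batch; as a local stopgap, a conftest.py shim along these lines (a minimal sketch, assuming nothing besides the logging call touches the attribute) would also unblock the tests:

    # conftest.py -- hypothetical stopgap shim for the chroot build only
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # urllib3 >= 2.3 reads this only to format its debug log line,
        # so a fixed value keeps the replayed responses usable.
        VCRHTTPResponse.version_string = "HTTP/1.1"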
_ CloudflareProviderTests.test_provider_when_calling_list_records_after_setting_ttl _
tests/providers/integration_tests.py:211: ???
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudflareProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? tests/providers/integration_tests.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudflare.py:56: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudflare.py:210: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/client/v4/zones?name=pacalis.net', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used.
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudflareProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudflare.py:56: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudflare.py:210: in _request ??? 
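Every failure in this run is the same incompatibility rather than a series of separate bugs: urllib3 2.3 started logging response.version_string inside connectionpool._make_request, but under VCR the object returned by conn.getresponse() is vcrpy's VCRHTTPResponse stub, which predates that attribute, so the debug-logging call raises before the recorded Cloudflare reply ever reaches requests. A minimal standalone sketch of the failing lookup (the class below is hypothetical and only mirrors the stub's shape; it is not vcrpy code):

    # Hypothetical stand-in for the recorded-response stub: status and
    # length_remaining exist, version_string does not -- the exact shape
    # the tracebacks above show.
    class FakeRecordedResponse:
        status = 200
        length_remaining = 0

    resp = FakeRecordedResponse()
    print(resp.status, resp.length_remaining)  # fine
    print(resp.version_string)                 # AttributeError, as in every test above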
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/client/v4/zones?name=pacalis.net', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudflareProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? 
tests/providers/integration_tests.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudflare.py:56: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudflare.py:210: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/client/v4/zones?name=pacalis.net', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudflareProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ self = > ??? tests/providers/integration_tests.py:501: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudflare.py:56: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudflare.py:210: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/client/v4/zones?name=pacalis.net', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudflareProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudflare.py:56: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudflare.py:210: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/client/v4/zones?name=pacalis.net', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudflareProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudflare.py:56: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudflare.py:210: in _request ??? 
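If the goal is only to get the check() phase past this until python-vcrpy catches up with urllib3 2.3, one hedged option is a local conftest shim that gives the stub the attribute urllib3 reads. This assumes VCRHTTPResponse is importable from vcr.stubs and that a constant "HTTP/1.1" string is acceptable, since urllib3 only uses the value for logging:

    # conftest.py -- local workaround sketch, not part of dns-lexicon upstream.
    import vcr.stubs

    if not hasattr(vcr.stubs.VCRHTTPResponse, "version_string"):
        # urllib3 >= 2.3 only reads this in a log.debug() call, so a fixed
        # protocol string is enough to stop the AttributeError above.
        vcr.stubs.VCRHTTPResponse.version_string = "HTTP/1.1"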
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/client/v4/zones?name=pacalis.net', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudflareProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? 
tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudflare.py:56: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudflare.py:210: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/client/v4/zones?name=pacalis.net', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudflareProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudflare.py:56: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudflare.py:210: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/client/v4/zones?name=pacalis.net', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, )
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudflareProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _
self = > ??? tests/providers/integration_tests.py:275:
[traceback and urllib3 _make_request source identical to the previous failure, omitted]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
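Every failure in this run hits the same AttributeError: urllib3's connection pool logs the response line through response.version_string, and the VCRHTTPResponse stub that vcrpy substitutes during cassette playback does not define that attribute. This pattern matches the known incompatibility between older vcrpy releases and urllib3 2.3+, rather than a regression in lexicon itself. A minimal local workaround, assuming that diagnosis, would be to give the stub class the missing attribute from the test suite's conftest.py; urllib3 only uses the value for a debug log line, so a fixed placeholder is enough. The class path and placeholder below are illustrative assumptions; the cleaner fix is a python-vcrpy release that already supports urllib3 2.3.

    # conftest.py -- hypothetical workaround, assuming the failures are only the
    # vcrpy/urllib3 2.3 incompatibility seen above and not a provider bug.
    from vcr.stubs import VCRHTTPResponse

    # urllib3 2.3 reads response.version_string when logging the request line;
    # the playback stub predates that attribute, so give it a harmless placeholder.
    if not hasattr(VCRHTTPResponse, "version_string"):
        VCRHTTPResponse.version_string = "HTTP/1.1"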
_ CloudflareProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _
self = > ??? tests/providers/integration_tests.py:259:
[traceback and urllib3 _make_request source identical to the previous failures, omitted]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_______________ CloudnsProviderTests.test_provider_authenticate ________________
self = > ??? tests/providers/integration_tests.py:106:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudns.py:45: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudns.py:213: in _request ???
[requests/urllib3 frames identical to the Cloudflare failures above]
self = conn = method = 'GET' url = '/dns/get-zone-info.json?domain-name=api-example.com&auth-id=placeholder_auth_id&auth-password=placeholder_auth_password' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True
[urllib3 _make_request source dump identical to the first failure above, omitted]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
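Both the Cloudflare and ClouDNS suites fail inside authenticate(), before any provider-specific assertion runs, which supports the playback-layer diagnosis above. A quick diagnostic (a sketch, not part of the build) to confirm which urllib3/vcrpy pairing is active in the chroot:

    # Diagnostic sketch: report the urllib3 and vcrpy versions installed in the chroot.
    import importlib.metadata as md
    import urllib3

    print("urllib3:", urllib3.__version__)   # the version_string access appears in recent 2.x logging
    print("vcrpy:", md.version("vcrpy"))     # the stub lacks version_string in the release used here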
_ CloudnsProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
self = > ??? tests/providers/integration_tests.py:126:
[traceback and urllib3 _make_request source identical to the CloudnsProviderTests.test_provider_authenticate failure above, omitted]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudnsProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
self = > ??? tests/providers/integration_tests.py:133:
[traceback and urllib3 _make_request source identical to the failure above, omitted]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudnsProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
self = > ??? tests/providers/integration_tests.py:156:
[traceback and urllib3 _make_request source identical to the failure above, omitted]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudnsProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _
self = > ??? tests/providers/integration_tests.py:147:
[traceback and urllib3 _make_request source identical to the failure above, omitted]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudnsProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _
self = > ??? tests/providers/integration_tests.py:140:
[traceback identical to the failures above; the tail of the urllib3 _make_request source and the resulting error follow]
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudnsProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _ self = > ??? tests/providers/integration_tests.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudns.py:45: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudns.py:213: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/dns/get-zone-info.json?domain-name=api-example.com&auth-id=placeholder_auth_id&auth-password=placeholder_auth_password' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
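Every failure in this run has the same root cause: urllib3's debug log line at connectionpool.py:551 formats response.version_string, an attribute added in urllib3 2.3, but the response replayed from the vcrpy cassette is a VCRHTTPResponse stub that does not define it. A minimal test-side shim, sketched below under the assumption that the stub only lacks the attribute (the vcrpy version installed in the chroot is not shown in this log), would be enough to satisfy the lookup; the cleaner fix is a vcrpy release whose stub provides version_string itself.

    # conftest.py (sketch): hypothetical workaround, not the change actually
    # applied to this package. Assumes vcr.stubs.VCRHTTPResponse simply lacks
    # the attribute that urllib3 2.3 interpolates into its debug log message.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # A constant is sufficient: urllib3 only uses the value for logging
        # inside _make_request().
        VCRHTTPResponse.version_string = "HTTP/1.1"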
_ CloudnsProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _

self =
???
tests/providers/integration_tests.py:475:
[traceback, request locals and urllib3 _make_request() listing identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudnsProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _

self =
???
tests/providers/integration_tests.py:303:
[traceback, request locals and urllib3 _make_request() listing identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
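All of these authenticate() calls replay the same recorded request, visible in the captured locals above: a GET against /dns/get-zone-info.json with the placeholder credentials baked into the cassette. For reference, a standalone form of that call might look like the sketch below; the base URL is an assumption (only the path and query string appear in this log), and the real request is issued from src/lexicon/_private/providers/cloudns.py.

    # Hypothetical standalone form of the recorded ClouDNS zone-info lookup.
    import requests

    params = {
        "domain-name": "api-example.com",
        "auth-id": "placeholder_auth_id",
        "auth-password": "placeholder_auth_password",
    }
    # https://api.cloudns.net is assumed; the traceback records only the path.
    resp = requests.get("https://api.cloudns.net/dns/get-zone-info.json", params=params)
    print(resp.status_code, resp.json())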
_ CloudnsProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _

self =
???
tests/providers/integration_tests.py:327:
[traceback, request locals and urllib3 _make_request() listing identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudnsProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _

self =
???
tests/providers/integration_tests.py:313:
[traceback, request locals and urllib3 _make_request() listing identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudnsProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _

self =
???
tests/providers/integration_tests.py:294:
[traceback, request locals and urllib3 _make_request() listing identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudnsProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _

self =
???
tests/providers/integration_tests.py:541:
[traceback, request locals and urllib3 _make_request() listing identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
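Nothing here is specific to the ClouDNS provider: any request replayed through a vcrpy cassette reaches the same log.debug() call in urllib3's _make_request(), and its arguments, including response.version_string, are evaluated before the logging level is even consulted. A minimal reproduction, sketched with an illustrative cassette path and URL:

    # Hypothetical minimal reproduction of the AttributeError above.
    import requests
    import vcr

    with vcr.use_cassette("fixtures/repro.yaml", record_mode="once"):
        # On replay, urllib3 builds its debug log line from
        # response.version_string; an older VCRHTTPResponse stub lacks it.
        requests.get("https://example.com/")

With record_mode="once" the first run records the cassette against the live URL; the AttributeError appears on the second, replayed run.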
_ CloudnsProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _

self =
???
tests/providers/integration_tests.py:521:
[traceback, request locals and urllib3 _make_request() listing identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudnsProviderTests.test_provider_when_calling_list_records_after_setting_ttl _

self =
???
tests/providers/integration_tests.py:211:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/cloudns.py:45: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/cloudns.py:213: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
conn =
method = 'GET'
url = '/dns/get-zone-info.json?domain-name=api-example.com&auth-id=placeholder_auth_id&auth-password=placeholder_auth_password'
body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =
preload_content = False, decode_content = False, enforce_content_length = True

    def _make_request(
        self,
        conn: BaseHTTPConnection,
        method: str,
        url: str,
        body: _TYPE_BODY | None = None,
        headers: typing.Mapping[str, str] | None = None,
        retries: Retry | None = None,
        timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
        chunked: bool = False,
        response_conn: BaseHTTPConnection | None = None,
        preload_content: bool = True,
        decode_content: bool = True,
        enforce_content_length: bool = True,
    ) -> BaseHTTPResponse:
        """
        Perform a request on a given urllib connection object taken from our pool.

        :param conn:
            a connection from one of our connection pools
        :param method:
            HTTP request method (such as GET, POST, PUT, etc.)
        :param url:
            The URL to perform the request on.
        :param body:
            Data to send in the request body, either :class:`str`, :class:`bytes`,
            an iterable of :class:`str`/:class:`bytes`, or a file-like object.
        :param headers:
            Dictionary of custom headers to send, such as User-Agent,
            If-None-Match, etc. If None, pool headers are used. If provided,
            these headers completely replace any pool-specific headers.
        :param retries:
            Configure the number of retries to allow before raising a
            :class:`~urllib3.exceptions.MaxRetryError` exception.
            Pass ``None`` to retry until you receive a response. Pass a
            :class:`~urllib3.util.retry.Retry` object for fine-grained control
            over different types of retries. Pass an integer number to retry
            connection errors that many times, but no other types of errors.
            Pass zero to never retry. If ``False``, then retries are disabled
            and any exception is raised immediately. Also, instead of raising a
            MaxRetryError on redirects, the redirect response will be returned.
        :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
        :param timeout:
            If specified, overrides the default timeout for this one request.
            It may be a float (in seconds) or an instance of
            :class:`urllib3.util.Timeout`.
        :param chunked:
            If True, urllib3 will send the body using chunked transfer
            encoding. Otherwise, urllib3 will send the body using the standard
            content-length form. Defaults to False.
        :param response_conn:
            Set this to ``None`` if you will handle releasing the connection or
            set the connection to have the response release it.
        :param preload_content:
            If True, the response's body will be preloaded during construction.
        :param decode_content:
            If True, will attempt to decode the body based on the
            'content-encoding' header.
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudnsProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? 
tests/providers/integration_tests.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudns.py:45: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudns.py:213: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/dns/get-zone-info.json?domain-name=api-example.com&auth-id=placeholder_auth_id&auth-password=placeholder_auth_password' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudnsProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudns.py:45: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudns.py:213: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/dns/get-zone-info.json?domain-name=api-example.com&auth-id=placeholder_auth_id&auth-password=placeholder_auth_password' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudnsProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudns.py:45: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudns.py:213: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/dns/get-zone-info.json?domain-name=api-example.com&auth-id=placeholder_auth_id&auth-password=placeholder_auth_password' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudnsProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ self = > ??? tests/providers/integration_tests.py:501: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudns.py:45: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudns.py:213: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/dns/get-zone-info.json?domain-name=api-example.com&auth-id=placeholder_auth_id&auth-password=placeholder_auth_password' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudnsProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudns.py:45: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudns.py:213: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/dns/get-zone-info.json?domain-name=api-example.com&auth-id=placeholder_auth_id&auth-password=placeholder_auth_password' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudnsProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? 
tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudns.py:45: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudns.py:213: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/dns/get-zone-info.json?domain-name=api-example.com&auth-id=placeholder_auth_id&auth-password=placeholder_auth_password' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudnsProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudns.py:45: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudns.py:213: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/dns/get-zone-info.json?domain-name=api-example.com&auth-id=placeholder_auth_id&auth-password=placeholder_auth_password' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
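All of these Cloudns failures are the same mismatch: recent urllib3 releases log response.version_string from connectionpool._make_request(), but the VCRHTTPResponse objects replayed from the recorded test cassettes do not carry that attribute, so every provider request aborts before authenticate() gets an answer. A minimal local workaround sketch, assuming the stub merely lacks the attribute (the vcr.stubs import path and the constant value below are illustrative assumptions, not taken from this log), would be to backfill it from the test suite's conftest.py:

    # conftest.py -- hypothetical shim, not part of the dns-lexicon sources
    try:
        from vcr.stubs import VCRHTTPResponse
    except ImportError:
        VCRHTTPResponse = None  # vcrpy not installed; nothing to patch

    if VCRHTTPResponse is not None and not hasattr(VCRHTTPResponse, "version_string"):
        # urllib3 only interpolates this value into a debug log line, so a
        # constant is good enough for cassette replays.
        VCRHTTPResponse.version_string = "HTTP/1.1"

Rebuilding against a python-vcrpy release that already defines version_string, or against an older urllib3 that does not log it, would resolve the same mismatch without touching the test session.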
_ CloudnsProviderTests.test_provider_when_calling_update_record_should_modify_record _

tests/providers/integration_tests.py:240:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/cloudns.py:45: in authenticate
...
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudnsProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _

self =

    ???
tests/providers/integration_tests.py:251:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/cloudns.py:45: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/cloudns.py:213: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
conn =
method = 'GET'
url = '/dns/get-zone-info.json?domain-name=api-example.com&auth-id=placeholder_auth_id&auth-password=placeholder_auth_password'
body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =
preload_content = False, decode_content = False, enforce_content_length = True

    def _make_request(
        self,
        conn: BaseHTTPConnection,
        method: str,
        url: str,
        body: _TYPE_BODY | None = None,
        headers: typing.Mapping[str, str] | None = None,
        retries: Retry | None = None,
        timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
        chunked: bool = False,
        response_conn: BaseHTTPConnection | None = None,
        preload_content: bool = True,
        decode_content: bool = True,
        enforce_content_length: bool = True,
    ) -> BaseHTTPResponse:
        """
        Perform a request on a given urllib connection object taken from our pool.

        :param conn: a connection from one of our connection pools
        :param method: HTTP request method (such as GET, POST, PUT, etc.)
        :param url: The URL to perform the request on.
        :param body: Data to send in the request body, either :class:`str`, :class:`bytes`,
            an iterable of :class:`str`/:class:`bytes`, or a file-like object.
        :param headers: Dictionary of custom headers to send, such as User-Agent,
            If-None-Match, etc. If None, pool headers are used. If provided,
            these headers completely replace any pool-specific headers.
        :param retries: Configure the number of retries to allow before raising a
            :class:`~urllib3.exceptions.MaxRetryError` exception.
            Pass ``None`` to retry until you receive a response. Pass a
            :class:`~urllib3.util.retry.Retry` object for fine-grained control
            over different types of retries.
            Pass an integer number to retry connection errors that many times,
            but no other types of errors. Pass zero to never retry.
            If ``False``, then retries are disabled and any exception is raised
            immediately. Also, instead of raising a MaxRetryError on redirects,
            the redirect response will be returned.
        :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
        :param timeout: If specified, overrides the default timeout for this one
            request. It may be a float (in seconds) or an instance of
            :class:`urllib3.util.Timeout`.
        :param chunked: If True, urllib3 will send the body using chunked transfer
            encoding. Otherwise, urllib3 will send the body using the standard
            content-length form. Defaults to False.
        :param response_conn: Set this to ``None`` if you will handle releasing the
            connection or set the connection to have the response release it.
        :param preload_content: If True, the response's body will be preloaded
            during construction.
        :param decode_content: If True, will attempt to decode the body based on the
            'content-encoding' header.
        :param enforce_content_length: Enforce content length checking. Body returned
            by server must match value of Content-Length header, if present.
            Otherwise, raise error.
        """
        self.num_requests += 1

        timeout_obj = self._get_timeout(timeout)
        timeout_obj.start_connect()
        conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout)

        try:
            # Trigger any extra validation we need to do.
            try:
                self._validate_conn(conn)
            except (SocketTimeout, BaseSSLError) as e:
                self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
                raise
        # _validate_conn() starts the connection to an HTTPS proxy
        # so we need to wrap errors with 'ProxyError' here too.
        except (
            OSError,
            NewConnectionError,
            TimeoutError,
            BaseSSLError,
            CertificateError,
            SSLError,
        ) as e:
            new_e: Exception = e
            if isinstance(e, (BaseSSLError, CertificateError)):
                new_e = SSLError(e)
            # If the connection didn't successfully connect to it's proxy
            # then there
            if isinstance(
                new_e, (OSError, NewConnectionError, TimeoutError, SSLError)
            ) and (conn and conn.proxy and not conn.has_connected_to_proxy):
                new_e = _wrap_proxy_error(new_e, conn.proxy.scheme)
            raise new_e

        # conn.request() calls http.client.*.request, not the method in
        # urllib3.request. It also calls makefile (recv) on the socket.
        try:
            conn.request(
                method,
                url,
                body=body,
                headers=headers,
                chunked=chunked,
                preload_content=preload_content,
                decode_content=decode_content,
                enforce_content_length=enforce_content_length,
            )

        # We are swallowing BrokenPipeError (errno.EPIPE) since the server is
        # legitimately able to close the connection after sending a valid response.
        # With this behaviour, the received response is still readable.
        except BrokenPipeError:
            pass
        except OSError as e:  # MacOS/Linux
            # EPROTOTYPE and ECONNRESET are needed on macOS
            # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/
            # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE.
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudnsProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudns.py:45: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudns.py:213: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/dns/get-zone-info.json?domain-name=api-example.com&auth-id=placeholder_auth_id&auth-password=placeholder_auth_password' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
_ CloudnsProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _

self =
    ???
tests/providers/integration_tests.py:275:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudnsProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _

self =
    ???
tests/providers/integration_tests.py:259:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
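Every failure above is the same incompatibility: the urllib3 in this chroot logs response.version_string for each request, while the response object that vcrpy replays from a cassette (VCRHTTPResponse) predates that attribute. A minimal local stopgap, assuming vcrpy's stub class is importable as vcr.stubs.VCRHTTPResponse and that the attribute is only needed so the debug log line can format, would be to pre-seed it from the test suite's conftest.py; this is an illustration of the failure mode, not the packaged fix:

    # conftest.py -- hypothetical stopgap, not the upstream fix
    try:
        from vcr.stubs import VCRHTTPResponse
    except ImportError:  # vcrpy not installed; nothing to patch
        VCRHTTPResponse = None

    if VCRHTTPResponse is not None and not hasattr(VCRHTTPResponse, "version_string"):
        # newer urllib3 reads response.version_string when logging each request;
        # any string that can be interpolated into that log line is sufficient here.
        VCRHTTPResponse.version_string = "HTTP/1.1"

The same AttributeError then repeats for every cassette-backed provider, continuing with the CloudXNS suite: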
_______________ CloudXNSProviderTests.test_provider_authenticate _______________

self =
    ???
tests/providers/integration_tests.py:106:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/cloudxns.py:41: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/cloudxns.py:188: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
conn =
method = 'GET', url = '/api2/domain'
body = '{"login_token": "placeholder_auth_username,placeholder_auth_token", "format": "json"}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...un Jan 05 14:03:51 2025', 'API-HMAC': 'afc88b66a99180a80ee47acba00999c9', 'API-FORMAT': 'json', 'Content-Length': '85'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =
preload_content = False, decode_content = False, enforce_content_length = True

>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudXNSProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

self =
    ???
tests/providers/integration_tests.py:126:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudXNSProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

self =
    ???
tests/providers/integration_tests.py:133:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
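If patching a third-party stub from conftest.py is too invasive for the package build, the affected cassette-based suites can instead be deselected until vcrpy and urllib3 agree again. A sketch of a collection hook, assuming the two test classes seen failing in this log are the only ones involved (other providers may be affected as well):

    # conftest.py -- hypothetical skip list, derived only from the classes failing in this log
    import pytest

    AFFECTED = {"CloudnsProviderTests", "CloudXNSProviderTests"}

    def pytest_collection_modifyitems(config, items):
        skip = pytest.mark.skip(reason="vcrpy response stub lacks version_string expected by urllib3")
        for item in items:
            cls = getattr(item, "cls", None)
            if cls is not None and cls.__name__ in AFFECTED:
                item.add_marker(skip)

The remaining CloudXNS TXT-record tests fail identically: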
_ CloudXNSProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

self =
    ???
tests/providers/integration_tests.py:156:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudXNSProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

self =
    ???
tests/providers/integration_tests.py:147:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudXNSProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudxns.py:41: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudxns.py:188: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api2/domain' body = '{"login_token": "placeholder_auth_username,placeholder_auth_token", "format": "json"}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...un Jan 05 14:03:52 2025', 'API-HMAC': 'f806d7d54674775d4876b39739d8f699', 'API-FORMAT': 'json', 'Content-Length': '85'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudXNSProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _ self = > ??? tests/providers/integration_tests.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudxns.py:41: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudxns.py:188: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api2/domain' body = '{"login_token": "placeholder_auth_username,placeholder_auth_token", "format": "json"}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...un Jan 05 14:03:52 2025', 'API-HMAC': 'f806d7d54674775d4876b39739d8f699', 'API-FORMAT': 'json', 'Content-Length': '85'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudXNSProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ self = > ??? tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudxns.py:41: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudxns.py:188: in _request ??? 
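
Only one application frame in these tracebacks shows its source: src/lexicon/interfaces.py:163, where the provider's _get helper delegates to _request. The cloudxns.py and integration_tests.py frames print "???", which is pytest's marker for frames whose source it could not retrieve. A simplified sketch of that delegation, with everything except the single visible line treated as an assumption:

# Sketch of the call chain seen above; only the _get -> _request delegation is
# taken from the traceback, the surrounding class and defaults are illustrative.
class ProviderSketch:
    def _get(self, url="/", query_params=None):
        return self._request("GET", url, query_params=query_params)

    def _request(self, action="GET", url="/", data=None, query_params=None):
        # In the CloudXNS provider this is where the HTTP call is made via
        # requests, which is how control reaches the urllib3 frames below.
        raise NotImplementedError
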
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api2/domain' body = '{"login_token": "placeholder_auth_username,placeholder_auth_token", "format": "json"}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...un Jan 05 14:03:52 2025', 'API-HMAC': 'f806d7d54674775d4876b39739d8f699', 'API-FORMAT': 'json', 'Content-Length': '85'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudXNSProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? 
tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudxns.py:41: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudxns.py:188: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api2/domain' body = '{"login_token": "placeholder_auth_username,placeholder_auth_token", "format": "json"}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...un Jan 05 14:03:52 2025', 'API-HMAC': 'f806d7d54674775d4876b39739d8f699', 'API-FORMAT': 'json', 'Content-Length': '85'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudXNSProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudxns.py:41: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudxns.py:188: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api2/domain' body = '{"login_token": "placeholder_auth_username,placeholder_auth_token", "format": "json"}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...un Jan 05 14:03:52 2025', 'API-HMAC': 'f806d7d54674775d4876b39739d8f699', 'API-FORMAT': 'json', 'Content-Length': '85'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudXNSProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudxns.py:41: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudxns.py:188: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api2/domain' body = '{"login_token": "placeholder_auth_username,placeholder_auth_token", "format": "json"}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...un Jan 05 14:03:53 2025', 'API-HMAC': '0da6d997e8f9248242c0afb9d588d016', 'API-FORMAT': 'json', 'Content-Length': '85'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudXNSProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudxns.py:41: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudxns.py:188: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api2/domain' body = '{"login_token": "placeholder_auth_username,placeholder_auth_token", "format": "json"}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...un Jan 05 14:03:53 2025', 'API-HMAC': '0da6d997e8f9248242c0afb9d588d016', 'API-FORMAT': 'json', 'Content-Length': '85'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudXNSProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? 
tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudxns.py:41: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudxns.py:188: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api2/domain' body = '{"login_token": "placeholder_auth_username,placeholder_auth_token", "format": "json"}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...un Jan 05 14:03:53 2025', 'API-HMAC': '0da6d997e8f9248242c0afb9d588d016', 'API-FORMAT': 'json', 'Content-Length': '85'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
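All of the CloudXNS failures in this run are the same error: the urllib3 in this chroot logs response.version_string after receiving a response (visible in the log.debug call above, apparently an addition of the urllib3 2.3 series), and the VCRHTTPResponse object that vcrpy substitutes while replaying the recorded cassettes does not define that attribute. A minimal workaround sketch, assuming the stub class is importable as vcr.stubs.VCRHTTPResponse and that only the debug log line reads the attribute, is to give the stub a placeholder value from a local conftest.py; the proper fix is a vcrpy release that provides version_string itself.

    # conftest.py -- workaround sketch (assumption: vcrpy's VCRHTTPResponse lacks
    # version_string, which this urllib3 reads when logging a finished request)
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # urllib3 only uses the value in its debug log line, so a fixed
        # placeholder is enough to let recorded responses play back.
        VCRHTTPResponse.version_string = "HTTP/1.1"

This is a test-environment shim only; it does not change what the lexicon providers send on the wire.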
_ CloudXNSProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ tests/providers/integration_tests.py:521: ... E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudXNSProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ tests/providers/integration_tests.py:211: ... E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudXNSProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ tests/providers/integration_tests.py:507: ... E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudXNSProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ tests/providers/integration_tests.py:199: ... E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
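Until the cassette stub and urllib3 agree again, the remaining CloudXNS provider tests fail identically, so for packaging purposes it may be simpler to deselect them than to patch the stub. A hypothetical sketch of doing that from a local conftest.py (the class name comes from the failures above; the hook is standard pytest):

    # conftest.py -- skip sketch: mark every CloudXNS provider test as skipped
    # until the vcrpy/urllib3 incompatibility is resolved upstream.
    import pytest

    def pytest_collection_modifyitems(config, items):
        skip_marker = pytest.mark.skip(
            reason="VCRHTTPResponse has no version_string under this urllib3"
        )
        for item in items:
            if "CloudXNSProviderTests" in item.nodeid:
                item.add_marker(skip_marker)

The same effect can be had on the pytest command line with -k 'not CloudXNS' or per-test --deselect options, without touching the test tree.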
_ CloudXNSProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ tests/providers/integration_tests.py:185: ... E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudXNSProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ tests/providers/integration_tests.py:501: ... E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudXNSProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ tests/providers/integration_tests.py:173: ... E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ CloudXNSProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ???
tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudxns.py:41: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudxns.py:188: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api2/domain' body = '{"login_token": "placeholder_auth_username,placeholder_auth_token", "format": "json"}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...un Jan 05 14:03:55 2025', 'API-HMAC': '8559c51746073d42fe015e795ed02dce', 'API-FORMAT': 'json', 'Content-Length': '85'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudXNSProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudxns.py:41: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudxns.py:188: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api2/domain' body = '{"login_token": "placeholder_auth_username,placeholder_auth_token", "format": "json"}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...un Jan 05 14:03:55 2025', 'API-HMAC': '8559c51746073d42fe015e795ed02dce', 'API-FORMAT': 'json', 'Content-Length': '85'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudXNSProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudxns.py:41: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudxns.py:188: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api2/domain' body = '{"login_token": "placeholder_auth_username,placeholder_auth_token", "format": "json"}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...un Jan 05 14:03:55 2025', 'API-HMAC': '8559c51746073d42fe015e795ed02dce', 'API-FORMAT': 'json', 'Content-Length': '85'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ CloudXNSProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/cloudxns.py:41: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/cloudxns.py:188: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api2/domain' body = '{"login_token": "placeholder_auth_username,placeholder_auth_token", "format": "json"}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...un Jan 05 14:03:56 2025', 'API-HMAC': '62477430e04e26991ba315bc361846cc', 'API-FORMAT': 'json', 'Content-Length': '85'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError ________________ ConohaProviderTests.test_provider_authenticate ________________ self = > ??? 
tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/conoha.py:91: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/conoha.py:190: in _request ??? src/lexicon/_private/providers/conoha.py:199: in _send_request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains?name=narusejun.com.', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'X-Auth-Token': 'placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ConohaProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/conoha.py:91: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/conoha.py:190: in _request ??? src/lexicon/_private/providers/conoha.py:199: in _send_request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains?name=narusejun.com.', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'X-Auth-Token': 'placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ConohaProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:133: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/conoha.py:91: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/conoha.py:190: in _request ??? src/lexicon/_private/providers/conoha.py:199: in _send_request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains?name=narusejun.com.', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'X-Auth-Token': 'placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ConohaProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = > ??? tests/providers/integration_tests.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/conoha.py:91: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/conoha.py:190: in _request ??? src/lexicon/_private/providers/conoha.py:199: in _send_request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains?name=narusejun.com.', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'X-Auth-Token': 'placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ConohaProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = > ??? 
tests/providers/integration_tests.py:147:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/conoha.py:91: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/conoha.py:190: in _request
    ???
src/lexicon/_private/providers/conoha.py:199: in _send_request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  conn =  method = 'GET', url = '/v1/domains?name=narusejun.com.', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'X-Auth-Token': 'placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =  preload_content = False, decode_content = False, enforce_content_length = True

    [... full source of urllib3's _make_request printed by pytest here; it is repeated verbatim in every Conoha failure below, only the failing tail is kept ...]

            log.debug(
                '%s://%s:%s "%s %s %s" %s %s',
                self.scheme,
                self.host,
                self.port,
                method,
                url,
>               response.version_string,
                response.status,
                response.length_remaining,
            )
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
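All of these Conoha failures are one and the same crash. The provider tests replay recorded HTTP traffic through vcrpy, whose VCRHTTPResponse object stands in for urllib3's real response, and the urllib3 in this chroot asks that object for version_string while debug-logging the request. Until the stub grows the attribute, a small compatibility shim in the test harness is one way to get past it. The sketch below is an assumption, not part of lexicon or vcrpy: the conftest.py placement, the vcr.stubs.VCRHTTPResponse import path, and the fixed "HTTP/1.1" value would all need checking against the vcrpy actually installed.

# conftest.py (hypothetical) - compatibility shim for replayed responses
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):
    # urllib3's connection pool logs response.version_string after reading
    # the response; the playback stub packaged here predates that attribute.
    VCRHTTPResponse.version_string = "HTTP/1.1"

Patching the stub only satisfies the logging call; if the recorded cassettes themselves are fine, the tests should run again, which makes this a quick way to see whether anything else is broken behind the first AttributeError.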
The remaining Conoha provider tests abort in exactly the same way: each one stops in _construct_authenticated_provider (tests/providers/integration_tests.py:418), walks the identical requests/urllib3 call chain through conoha.py:91 authenticate, and fails at /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551 with AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string':

_ ConohaProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ (tests/providers/integration_tests.py:140)
_ ConohaProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _ (tests/providers/integration_tests.py:489)
_ ConohaProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ (tests/providers/integration_tests.py:475)
_ ConohaProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ (tests/providers/integration_tests.py:303)
_ ConohaProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ (tests/providers/integration_tests.py:327)
_ ConohaProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ (tests/providers/integration_tests.py:313)
_ ConohaProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ (tests/providers/integration_tests.py:294)
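Before patching anything it is worth confirming which urllib3/vcrpy pairing the chroot actually resolved, since the attribute only exists in recent urllib3 releases and the packaged vcrpy stub clearly does not provide it. A throwaway probe along these lines (plain standard-library calls, nothing lexicon-specific) is enough:

# quick version probe for the build environment (hypothetical helper)
from importlib.metadata import version
import urllib3.response

print("urllib3", version("urllib3"), "vcrpy", version("vcrpy"))
# True on urllib3 releases whose responses expose version_string
print(hasattr(urllib3.response.HTTPResponse, "version_string"))

If the stub is simply behind, updating python-vcrpy (upstream appears to have picked up version_string support in later releases) or temporarily deselecting the Conoha tests in check() would be less invasive than carrying a local shim.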
_ ConohaProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _

self = > ???
tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/conoha.py:91: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/conoha.py:190: in _request ??? src/lexicon/_private/providers/conoha.py:199: in _send_request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains?name=narusejun.com.', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'X-Auth-Token': 'placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
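The failure above, and every Conoha failure that follows, is the same incompatibility rather than a provider bug: the urllib3 shipped in this chroot logs response.version_string from its connection pool (connectionpool.py:551), but these integration tests replay recorded traffic through vcrpy, and its VCRHTTPResponse stub apparently predates that attribute, so authenticate() dies on the first GET before any DNS logic runs. A minimal check of the mismatch, assuming only the urllib3 and vcr modules installed in the build root, would be:

# Sanity check of the mismatch in the tracebacks -- not part of the package
# sources; assumes the urllib3 and vcrpy versions installed in the build chroot.
import urllib3.response
from vcr.stubs import VCRHTTPResponse

# urllib3's own response class exposes version_string (connectionpool.py logs it) ...
print(hasattr(urllib3.response.HTTPResponse, "version_string"))
# ... while vcrpy's playback stub, used when cassettes are replayed, does not.
print(hasattr(VCRHTTPResponse, "version_string"))
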
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ConohaProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ self = > ??? tests/providers/integration_tests.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/conoha.py:91: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/conoha.py:190: in _request ??? src/lexicon/_private/providers/conoha.py:199: in _send_request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains?name=narusejun.com.', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'X-Auth-Token': 'placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ConohaProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? tests/providers/integration_tests.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/conoha.py:91: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/conoha.py:190: in _request ??? src/lexicon/_private/providers/conoha.py:199: in _send_request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains?name=narusejun.com.', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'X-Auth-Token': 'placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ConohaProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? 
tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/conoha.py:91: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/conoha.py:190: in _request ??? src/lexicon/_private/providers/conoha.py:199: in _send_request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains?name=narusejun.com.', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'X-Auth-Token': 'placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ConohaProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/conoha.py:91: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/conoha.py:190: in _request ??? src/lexicon/_private/providers/conoha.py:199: in _send_request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains?name=narusejun.com.', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'X-Auth-Token': 'placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ConohaProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ self = > ??? tests/providers/integration_tests.py:501: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/conoha.py:91: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/conoha.py:190: in _request ??? src/lexicon/_private/providers/conoha.py:199: in _send_request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains?name=narusejun.com.', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'X-Auth-Token': 'placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ConohaProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/conoha.py:91: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/conoha.py:190: in _request ??? src/lexicon/_private/providers/conoha.py:199: in _send_request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains?name=narusejun.com.', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'X-Auth-Token': 'placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ConohaProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? 
tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/conoha.py:91: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/conoha.py:190: in _request ??? src/lexicon/_private/providers/conoha.py:199: in _send_request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains?name=narusejun.com.', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'X-Auth-Token': 'placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ConohaProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/conoha.py:91: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/conoha.py:190: in _request ??? src/lexicon/_private/providers/conoha.py:199: in _send_request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains?name=narusejun.com.', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'X-Auth-Token': 'placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ConohaProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/conoha.py:91: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/conoha.py:190: in _request ??? src/lexicon/_private/providers/conoha.py:199: in _send_request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains?name=narusejun.com.', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'X-Auth-Token': 'placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. 
If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ConohaProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/conoha.py:91: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/conoha.py:190: in _request ??? src/lexicon/_private/providers/conoha.py:199: in _send_request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains?name=narusejun.com.', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'X-Auth-Token': 'placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ConohaProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? 
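Every one of these failures stops at the same statement: urllib3's connection pool logs response.version_string, an attribute that recent urllib3 (2.3) added to its HTTPResponse, while the VCRHTTPResponse stub that vcrpy substitutes when replaying the recorded cassettes in these provider tests predates that attribute. The replayed request therefore crashes inside urllib3's logging call before the provider code ever sees a response. A minimal local workaround sketch, assuming vcrpy still exposes the stub as vcr.stubs.VCRHTTPResponse and that a constant protocol string is acceptable for the debug log line (the proper fix is a vcrpy release that supports urllib3 >= 2.3):

    # conftest.py (hypothetical shim, not part of the package) -- backfill the
    # attribute that urllib3 >= 2.3 reads on every response it logs.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # urllib3's connectionpool._make_request logs response.version_string;
        # older vcrpy stubs never defined it, which is the AttributeError above.
        VCRHTTPResponse.version_string = "HTTP/1.1"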
_ ConohaProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _
self = > ???
tests/providers/integration_tests.py:259:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/conoha.py:91: in authenticate ???
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/conoha.py:190: in _request ???
src/lexicon/_private/providers/conoha.py:199: in _send_request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
method = 'GET', url = '/v1/domains?name=narusejun.com.', body = '{}'
>       response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
______________ ConstellixProviderTests.test_provider_authenticate ______________
self = > ???
tests/providers/integration_tests.py:106:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/constellix.py:58: in authenticate ???
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/constellix.py:271: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
method = 'GET', url = '/v1/domains/', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ame', 'x-cnsdns-requestDate': '1736103841924', 'x-cnsdns-hmac': b'wPuFllgdjGqAAQXD3zEuVWxV3bs=', 'Content-Length': '2'}
>       response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
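The Constellix failures that follow trip over the exact same missing attribute, so nothing provider-specific is wrong; the cassette replay layer simply cannot satisfy the new urllib3 logging call. Until the chroot carries a vcrpy/urllib3 pair that agree, one option for the packager is to skip the cassette-backed provider tests at collection time. A sketch, with an illustrative (not exhaustive) list of affected test classes:

    # conftest.py (hypothetical) -- skip cassette-backed provider tests while
    # the chroot's vcrpy stub and urllib3 disagree about version_string.
    import pytest

    AFFECTED = ("ConohaProviderTests", "ConstellixProviderTests")  # illustrative only

    def pytest_collection_modifyitems(config, items):
        skip = pytest.mark.skip(reason="vcrpy stub lacks urllib3 2.3 version_string")
        for item in items:
            if any(name in item.nodeid for name in AFFECTED):
                item.add_marker(skip)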
_ ConstellixProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
self = > ???
tests/providers/integration_tests.py:126:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/constellix.py:58: in authenticate ???
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/constellix.py:271: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
method = 'GET', url = '/v1/domains/', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ame', 'x-cnsdns-requestDate': '1736103842159', 'x-cnsdns-hmac': b'V/wbK1P4dSvEnJMWKj7XTctYNyw=', 'Content-Length': '2'}
>       response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ConstellixProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
self = > ???
tests/providers/integration_tests.py:133:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/constellix.py:58: in authenticate ???
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/constellix.py:271: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
method = 'GET', url = '/v1/domains/', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ame', 'x-cnsdns-requestDate': '1736103842371', 'x-cnsdns-hmac': b'we5NqBEStQ/qc51VdM/flIn0//o=', 'Content-Length': '2'}
>       response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ConstellixProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
self = > ???
tests/providers/integration_tests.py:156:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/constellix.py:58: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/constellix.py:271: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
conn =
method = 'GET', url = '/v1/domains/', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ame', 'x-cnsdns-requestDate': '1736103842590', 'x-cnsdns-hmac': b'DEH+Jcc7rHHse1fH32CsQaArLZM=', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =
preload_content = False, decode_content = False, enforce_content_length = True

[pytest's long-format traceback dumps the full source of urllib3's _make_request here; the failing statement is its final log.debug call:]

        response.retries = retries
        response._connection = response_conn  # type: ignore[attr-defined]
        response._pool = self  # type: ignore[attr-defined]

        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ConstellixProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

self = > ???
tests/providers/integration_tests.py:147:
[traceback identical to the failure above, ending at the same log.debug call in urllib3's _make_request; only the x-cnsdns-requestDate and x-cnsdns-hmac request headers differ]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ConstellixProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _

self = > ???
tests/providers/integration_tests.py:140:
[traceback identical to the failure above; only the x-cnsdns-requestDate and x-cnsdns-hmac request headers differ]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
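(Packager's note: every ConstellixProviderTests failure in this run stops at the same place. urllib3's connection pool logs the replayed response via response.version_string at connectionpool.py:551, and the VCRHTTPResponse object that vcrpy substitutes during cassette playback does not provide that attribute. A minimal local workaround, assuming the stub class is importable as vcr.stubs.VCRHTTPResponse and that nothing beyond this logging call needs the attribute, would be a conftest.py shim along these lines; it is a sketch, not part of the upstream test suite or of this package.)

    # conftest.py -- hypothetical compatibility shim, not shipped by lexicon or vcrpy.
    # Assumption: the only thing urllib3 wants from the playback stub is a
    # version_string value for the log.debug() call shown in the traceback above.
    try:
        from vcr.stubs import VCRHTTPResponse

        if not hasattr(VCRHTTPResponse, "version_string"):
            # A static placeholder is enough to keep cassette playback from raising.
            VCRHTTPResponse.version_string = "HTTP/1.1"
    except ImportError:
        # vcrpy is not installed; nothing to patch.
        pass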
_ ConstellixProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _

self = > ???
tests/providers/integration_tests.py:489:
[traceback identical to the first failure above; only the x-cnsdns-requestDate and x-cnsdns-hmac request headers differ]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ConstellixProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _

self = > ???
tests/providers/integration_tests.py:475:
[traceback identical to the first failure above; only the x-cnsdns-requestDate and x-cnsdns-hmac request headers differ]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ConstellixProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _

self = > ???
tests/providers/integration_tests.py:303:
[traceback identical to the first failure above; only the x-cnsdns-requestDate and x-cnsdns-hmac request headers differ]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ConstellixProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _

self = > ???
tests/providers/integration_tests.py:327:
[traceback identical to the first failure above; only the x-cnsdns-requestDate and x-cnsdns-hmac request headers differ]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ConstellixProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _

self = > ???
tests/providers/integration_tests.py:313:
[traceback identical to the first failure above; only the x-cnsdns-requestDate and x-cnsdns-hmac request headers differ]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
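(Packager's note: if patching the stub is not desirable, the cassette-driven Constellix tests could instead be excluded from this build until vcrpy ships a release that knows about urllib3's version_string. The conftest.py fragment below is a sketch; the module path providers/test_constellix.py is an assumption, not confirmed by this log.)

    # conftest.py -- hypothetical opt-out, not part of upstream lexicon.
    # Skips collection of the Constellix provider tests when the installed
    # vcrpy playback stub predates urllib3's version_string attribute.
    try:
        from vcr.stubs import VCRHTTPResponse
        _stub_is_outdated = not hasattr(VCRHTTPResponse, "version_string")
    except ImportError:
        _stub_is_outdated = False

    # Path is relative to the directory containing this conftest.py (assumed tests/).
    collect_ignore_glob = ["providers/test_constellix.py"] if _stub_is_outdated else []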
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains/', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ame', 'x-cnsdns-requestDate': '1736103844321', 'x-cnsdns-hmac': b'pNrAnyKvoqFybnMTvG09AWrvDQ8=', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ConstellixProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? 
tests/providers/integration_tests.py:294:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/constellix.py:58: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/constellix.py:271: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

method = 'GET', url = '/v1/domains/', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ame', 'x-cnsdns-requestDate': '1736103844540', 'x-cnsdns-hmac': b'VaGQ/UPP5ZpXH26M+Uz2YCMXy7c=', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
preload_content = False, decode_content = False, enforce_content_length = True

    def _make_request(
        self,
        conn: BaseHTTPConnection,
        method: str,
        url: str,
        body: _TYPE_BODY | None = None,
        headers: typing.Mapping[str, str] | None = None,
        retries: Retry | None = None,
        timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
        chunked: bool = False,
        response_conn: BaseHTTPConnection | None = None,
        preload_content: bool = True,
        decode_content: bool = True,
        enforce_content_length: bool = True,
    ) -> BaseHTTPResponse:
        """
        Perform a request on a given urllib connection object taken from our pool.
        ...
        """
        ...
        # Receive the response from the server
        try:
            response = conn.getresponse()
        except (BaseSSLError, OSError) as e:
            self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
            raise

        # Set properties that are used by the pooling layer.
        response.retries = retries
        response._connection = response_conn  # type: ignore[attr-defined]
        response._pool = self  # type: ignore[attr-defined]

        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
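Every ConstellixProviderTests failure in this run shows the same root cause, visible in the traceback above: urllib3's HTTPConnectionPool._make_request (connectionpool.py:551) logs response.version_string, but the replayed response object handed back by vcrpy during these recorded tests is a VCRHTTPResponse, which does not define that attribute. A minimal conftest-level shim along the following lines would let the recordings play back again; it assumes the stub class is importable as vcr.stubs.VCRHTTPResponse and is only a workaround sketch, not the fix shipped by vcrpy or dns-lexicon:

# conftest.py -- workaround sketch (assumption: vcrpy exposes the stub as vcr.stubs.VCRHTTPResponse;
# newer vcrpy releases define version_string themselves, making this unnecessary)
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):
    # urllib3 >= 2.3 reads response.version_string when logging a completed request;
    # give replayed responses a static value so the log.debug call above cannot raise.
    VCRHTTPResponse.version_string = "HTTP/1.1"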
_ ConstellixProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _

tests/providers/integration_tests.py:541:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/constellix.py:58: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/constellix.py:271: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ ConstellixProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _

tests/providers/integration_tests.py:521:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/constellix.py:58: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/constellix.py:271: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ ConstellixProviderTests.test_provider_when_calling_list_records_after_setting_ttl _

tests/providers/integration_tests.py:211:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/constellix.py:58: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/constellix.py:271: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ ConstellixProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _

tests/providers/integration_tests.py:507:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/constellix.py:58: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/constellix.py:271: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ ConstellixProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _

tests/providers/integration_tests.py:199:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/constellix.py:58: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/constellix.py:271: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ ConstellixProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _

tests/providers/integration_tests.py:185:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/constellix.py:58: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/constellix.py:271: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ ConstellixProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _

tests/providers/integration_tests.py:501:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/constellix.py:58: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/constellix.py:271: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ ConstellixProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _

tests/providers/integration_tests.py:173:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/constellix.py:58: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/constellix.py:271: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ ConstellixProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _

tests/providers/integration_tests.py:166:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/constellix.py:58: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/constellix.py:271: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ConstellixProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/constellix.py:58: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/constellix.py:271: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains/', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ame', 'x-cnsdns-requestDate': '1736103846723', 'x-cnsdns-hmac': b'28dEOYP94cN4n02ix7yIpA0wK74=', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ConstellixProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/constellix.py:58: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/constellix.py:271: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains/', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ame', 'x-cnsdns-requestDate': '1736103846941', 'x-cnsdns-hmac': b'Jnpp+t/IC4lIL/pZDmzQ14lOfmg=', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ConstellixProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/constellix.py:58: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/constellix.py:271: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains/', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ame', 'x-cnsdns-requestDate': '1736103847156', 'x-cnsdns-hmac': b'RXSnI4GKJcU8lV0y8rtTBbGuR+A=', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ConstellixProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? 
tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/constellix.py:58: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/constellix.py:271: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains/', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ame', 'x-cnsdns-requestDate': '1736103847379', 'x-cnsdns-hmac': b'6wye4yLFGoHev49+MwU2kcUHmM4=', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _____________ DigitalOceanProviderTests.test_provider_authenticate _____________ self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/digitalocean.py:32: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/digitalocean.py:156: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/domains/foxwoodswebsites.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DigitalOceanProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/digitalocean.py:32: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/digitalocean.py:156: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/domains/foxwoodswebsites.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DigitalOceanProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:133: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/digitalocean.py:32: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/digitalocean.py:156: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/domains/foxwoodswebsites.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DigitalOceanProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = > ??? 
_ DigitalOceanProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

self =
>   ???
tests/providers/integration_tests.py:156:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/digitalocean.py:32: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/digitalocean.py:156: in _request
    ???
[requests/urllib3 frames, request locals and the _make_request() listing are identical to the first failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DigitalOceanProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

self =
>   ???
tests/providers/integration_tests.py:147:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/digitalocean.py:32: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/digitalocean.py:156: in _request
    ???
[requests/urllib3 frames, request locals and the _make_request() listing are identical to the first failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DigitalOceanProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _

self =
>   ???
tests/providers/integration_tests.py:140:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/digitalocean.py:32: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/digitalocean.py:156: in _request
    ???
[requests/urllib3 frames, request locals and the _make_request() listing are identical to the first failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DigitalOceanProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _

self =
>   ???
tests/providers/integration_tests.py:489:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/digitalocean.py:32: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/digitalocean.py:156: in _request
    ???
[requests/urllib3 frames, request locals and the _make_request() listing are identical to the first failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DigitalOceanProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _

self =
>   ???
tests/providers/integration_tests.py:475:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/digitalocean.py:32: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/digitalocean.py:156: in _request
    ???
[requests/urllib3 frames, request locals and the _make_request() listing are identical to the first failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DigitalOceanProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _

self =
>   ???
tests/providers/integration_tests.py:303:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/digitalocean.py:32: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/digitalocean.py:156: in _request
    ???
[requests/urllib3 frames, request locals and the _make_request() listing are identical to the first failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DigitalOceanProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _

self =
>   ???
tests/providers/integration_tests.py:327:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/digitalocean.py:32: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/digitalocean.py:156: in _request
    ???
[requests/urllib3 frames, request locals and the _make_request() listing are identical to the first failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DigitalOceanProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _

self =
>   ???
tests/providers/integration_tests.py:313:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/digitalocean.py:32: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/digitalocean.py:156: in _request
    ???
[requests/urllib3 frames, request locals and the _make_request() listing are identical to the first failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DigitalOceanProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _

self =
>   ???
tests/providers/integration_tests.py:294:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/digitalocean.py:32: in authenticate ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/digitalocean.py:156: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DigitalOceanProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _
tests/providers/integration_tests.py:541:
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DigitalOceanProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _
tests/providers/integration_tests.py:521:
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DigitalOceanProviderTests.test_provider_when_calling_list_records_after_setting_ttl _
tests/providers/integration_tests.py:211:
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DigitalOceanProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _
tests/providers/integration_tests.py:507:
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DigitalOceanProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _
tests/providers/integration_tests.py:199:
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DigitalOceanProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _
tests/providers/integration_tests.py:185:
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DigitalOceanProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _
tests/providers/integration_tests.py:501:
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DigitalOceanProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ???
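Every failure in this block comes from the DigitalOcean provider suite and shares the same replay-layer cause, so it says nothing about the dns-lexicon code itself. As a packaging stopgap, the affected suite could be excluded from check() instead of disabling the whole test run; a hypothetical conftest.py fragment follows, where the test file path is assumed from the layout visible in the tracebacks:

# tests/conftest.py (hypothetical stopgap): collect everything except the
# DigitalOcean provider tests until the vcrpy/urllib3 mismatch is resolved.
collect_ignore_glob = ["providers/test_digitalocean.py"]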
tests/providers/integration_tests.py:173:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/digitalocean.py:32: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/digitalocean.py:156: in _request ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DigitalOceanProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _
self = > ???
tests/providers/integration_tests.py:166:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/digitalocean.py:32: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/digitalocean.py:156: in _request ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DigitalOceanProviderTests.test_provider_when_calling_update_record_should_modify_record _
self = > ???
tests/providers/integration_tests.py:240:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/digitalocean.py:32: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/digitalocean.py:156: in _request ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DigitalOceanProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _
self = > ???
tests/providers/integration_tests.py:275:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/digitalocean.py:32: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/digitalocean.py:156: in _request ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DigitalOceanProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _
self = > ???
tests/providers/integration_tests.py:259:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/digitalocean.py:32: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/digitalocean.py:156: in _request ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_____________ DinahostingProviderTests.test_provider_authenticate ______________
self = > ???
tests/providers/integration_tests.py:106:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/dinahosting.py:66: in authenticate ???
src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params)
src/lexicon/_private/providers/dinahosting.py:225: in _request ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DinahostingProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
self = > ???
tests/providers/integration_tests.py:126:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/dinahosting.py:66: in authenticate ???
src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params)
src/lexicon/_private/providers/dinahosting.py:225: in _request ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DinahostingProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
self = > ???
tests/providers/integration_tests.py:133:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/dinahosting.py:66: in authenticate ???
src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params)
src/lexicon/_private/providers/dinahosting.py:225: in _request ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DinahostingProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
self = > ???
_ DinahostingProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
self = > ???
tests/providers/integration_tests.py:156:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/dinahosting.py:66: in authenticate ???
src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params)
src/lexicon/_private/providers/dinahosting.py:225: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DinahostingProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _
self = > ???
tests/providers/integration_tests.py:147:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/dinahosting.py:66: in authenticate ???
src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params)
src/lexicon/_private/providers/dinahosting.py:225: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DinahostingProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _
self = > ???
tests/providers/integration_tests.py:140:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/dinahosting.py:66: in authenticate ???
src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params)
src/lexicon/_private/providers/dinahosting.py:225: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DinahostingProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _
self = > ???
tests/providers/integration_tests.py:489:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/dinahosting.py:66: in authenticate ???
src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params)
src/lexicon/_private/providers/dinahosting.py:225: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DinahostingProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _
self = > ???
tests/providers/integration_tests.py:475:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/dinahosting.py:66: in authenticate ???
src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params)
src/lexicon/_private/providers/dinahosting.py:225: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DinahostingProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _
self = > ???
tests/providers/integration_tests.py:303:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/dinahosting.py:66: in authenticate ???
src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params)
src/lexicon/_private/providers/dinahosting.py:225: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DinahostingProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _
self = > ???
tests/providers/integration_tests.py:327:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/dinahosting.py:66: in authenticate ???
src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params)
src/lexicon/_private/providers/dinahosting.py:225: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DinahostingProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _
self = > ???
tests/providers/integration_tests.py:313:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/dinahosting.py:66: in authenticate ???
src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params)
src/lexicon/_private/providers/dinahosting.py:225: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DinahostingProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _
self = > ???
tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dinahosting.py:66: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/dinahosting.py:225: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/special/api.php' body = 'command=Services_GetDomains&responseType=Json' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv.../x-www-form-urlencoded', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Bhc3N3b3Jk'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DinahostingProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dinahosting.py:66: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/dinahosting.py:225: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/special/api.php' body = 'command=Services_GetDomains&responseType=Json' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv.../x-www-form-urlencoded', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Bhc3N3b3Jk'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DinahostingProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dinahosting.py:66: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/dinahosting.py:225: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/special/api.php' body = 'command=Services_GetDomains&responseType=Json' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv.../x-www-form-urlencoded', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Bhc3N3b3Jk'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DinahostingProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? tests/providers/integration_tests.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dinahosting.py:66: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/dinahosting.py:225: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/special/api.php' body = 'command=Services_GetDomains&responseType=Json' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv.../x-www-form-urlencoded', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Bhc3N3b3Jk'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DinahostingProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? 
tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dinahosting.py:66: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/dinahosting.py:225: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/special/api.php' body = 'command=Services_GetDomains&responseType=Json' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv.../x-www-form-urlencoded', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Bhc3N3b3Jk'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DinahostingProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dinahosting.py:66: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/dinahosting.py:225: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/special/api.php' body = 'command=Services_GetDomains&responseType=Json' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv.../x-www-form-urlencoded', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Bhc3N3b3Jk'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DinahostingProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ self = > ??? tests/providers/integration_tests.py:501: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dinahosting.py:66: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/dinahosting.py:225: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/special/api.php' body = 'command=Services_GetDomains&responseType=Json' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv.../x-www-form-urlencoded', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Bhc3N3b3Jk'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DinahostingProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dinahosting.py:66: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/dinahosting.py:225: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/special/api.php' body = 'command=Services_GetDomains&responseType=Json' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv.../x-www-form-urlencoded', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Bhc3N3b3Jk'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DinahostingProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? 
_ DinahostingProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _
tests/providers/integration_tests.py:166:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/dinahosting.py:66: in authenticate
src/lexicon/interfaces.py:171: in _post
src/lexicon/_private/providers/dinahosting.py:225: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DinahostingProviderTests.test_provider_when_calling_update_record_should_modify_record _
tests/providers/integration_tests.py:240:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/dinahosting.py:66: in authenticate
src/lexicon/interfaces.py:171: in _post
src/lexicon/_private/providers/dinahosting.py:225: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DinahostingProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _
tests/providers/integration_tests.py:251:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/dinahosting.py:66: in authenticate
src/lexicon/interfaces.py:171: in _post
src/lexicon/_private/providers/dinahosting.py:225: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DinahostingProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _
tests/providers/integration_tests.py:275:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/dinahosting.py:66: in authenticate
src/lexicon/interfaces.py:171: in _post
src/lexicon/_private/providers/dinahosting.py:225: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DinahostingProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _
tests/providers/integration_tests.py:259:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/dinahosting.py:66: in authenticate
src/lexicon/interfaces.py:171: in _post
src/lexicon/_private/providers/dinahosting.py:225: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_____________ DirectAdminProviderTests.test_provider_authenticate ______________
tests/providers/integration_tests.py:106:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/directadmin.py:47: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/directadmin.py:223: in _request
method = 'GET', url = '/CMD_API_SHOW_DOMAINS?domain=example.com', body = None
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DirectAdminProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
tests/providers/integration_tests.py:126:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/directadmin.py:47: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/directadmin.py:223: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DirectAdminProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
tests/providers/integration_tests.py:133:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/directadmin.py:47: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/directadmin.py:223: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DirectAdminProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
self = > ???
_ DirectAdminProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
tests/providers/integration_tests.py:156:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/directadmin.py:47: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/directadmin.py:223: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>           response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DirectAdminProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _
tests/providers/integration_tests.py:147:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/directadmin.py:47: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/directadmin.py:223: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>           response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DirectAdminProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _
tests/providers/integration_tests.py:140:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/directadmin.py:47: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/directadmin.py:223: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>           response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DirectAdminProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _
tests/providers/integration_tests.py:489:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/directadmin.py:47: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/directadmin.py:223: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>           response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DirectAdminProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _
tests/providers/integration_tests.py:475:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/directadmin.py:47: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/directadmin.py:223: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>           response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DirectAdminProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _
tests/providers/integration_tests.py:303:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/directadmin.py:47: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/directadmin.py:223: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>           response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DirectAdminProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _
tests/providers/integration_tests.py:327:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/directadmin.py:47: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/directadmin.py:223: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>           response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DirectAdminProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _
tests/providers/integration_tests.py:313:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/directadmin.py:47: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/directadmin.py:223: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>           response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
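An alternative to patching the stub is to skip the DirectAdmin provider suite at collection time until the incompatibility is resolved upstream. The conftest.py hook below is only a sketch of that idea, not something this build currently does.

    # conftest.py (hypothetical sketch): deselect the DirectAdmin provider tests
    # while the cassette stub lacks the version_string attribute urllib3 expects.
    import pytest

    def pytest_collection_modifyitems(config, items):
        skip_directadmin = pytest.mark.skip(
            reason="vcrpy's VCRHTTPResponse has no version_string attribute"
        )
        for item in items:
            if "DirectAdminProviderTests" in item.nodeid:
                item.add_marker(skip_directadmin)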
tests/providers/integration_tests.py:294:
    (identical call chain to the failure above: tests/providers/integration_tests.py:418 in _construct_authenticated_provider, src/lexicon/_private/providers/directadmin.py:47 in authenticate, src/lexicon/interfaces.py:163 in _get, src/lexicon/_private/providers/directadmin.py:223 in _request, then requests and urllib3's _make_request for GET '/CMD_API_SHOW_DOMAINS?domain=example.com')
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
The next seven DirectAdminProviderTests failures abort in the same way, with the identical AttributeError raised from urllib3/connectionpool.py:551 while authenticating the provider; they differ only in the integration test line that triggers the request:
_ DirectAdminProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ (tests/providers/integration_tests.py:541)
_ DirectAdminProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ (tests/providers/integration_tests.py:521)
_ DirectAdminProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ (tests/providers/integration_tests.py:211)
_ DirectAdminProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ (tests/providers/integration_tests.py:507)
_ DirectAdminProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ (tests/providers/integration_tests.py:199)
_ DirectAdminProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ (tests/providers/integration_tests.py:185)
_ DirectAdminProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ (tests/providers/integration_tests.py:501)
_ DirectAdminProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _
_ DirectAdminProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _

tests/providers/integration_tests.py:173:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/directadmin.py:47: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/directadmin.py:223: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ DirectAdminProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _

tests/providers/integration_tests.py:166:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/directadmin.py:47: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/directadmin.py:223: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ DirectAdminProviderTests.test_provider_when_calling_update_record_should_modify_record _

tests/providers/integration_tests.py:240:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/directadmin.py:47: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/directadmin.py:223: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ DirectAdminProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _

tests/providers/integration_tests.py:251:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/directadmin.py:47: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/directadmin.py:223: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ DirectAdminProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _

tests/providers/integration_tests.py:275:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/directadmin.py:47: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/directadmin.py:223: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ DirectAdminProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _

tests/providers/integration_tests.py:259:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/directadmin.py:47: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/directadmin.py:223: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_______________ DnsimpleProviderTests.test_provider_authenticate _______________

tests/providers/integration_tests.py:106:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/dnsimple.py:46: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dnsimple.py:211: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(

method = 'GET', url = '/v2/accounts', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)

E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ DnsimpleProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

tests/providers/integration_tests.py:126:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/dnsimple.py:46: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dnsimple.py:211: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ DnsimpleProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
_ DnsimpleProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

self = > ???

tests/providers/integration_tests.py:133:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dnsimple.py:46: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dnsimple.py:211: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsimpleProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

self = > ???

tests/providers/integration_tests.py:156:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dnsimple.py:46: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dnsimple.py:211: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsimpleProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

self = > ???

tests/providers/integration_tests.py:147:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dnsimple.py:46: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dnsimple.py:211: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsimpleProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _

self = > ???

tests/providers/integration_tests.py:140:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dnsimple.py:46: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dnsimple.py:211: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsimpleProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _

self = > ???

tests/providers/integration_tests.py:489:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dnsimple.py:46: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dnsimple.py:211: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsimpleProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _

self = > ???

tests/providers/integration_tests.py:475:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dnsimple.py:46: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dnsimple.py:211: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsimpleProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _

self = > ???

tests/providers/integration_tests.py:303:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dnsimple.py:46: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dnsimple.py:211: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsimpleProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _

self = > ???

tests/providers/integration_tests.py:327:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dnsimple.py:46: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dnsimple.py:211: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
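If neither dependency can be adjusted at build time, the cassette-backed DNSimple suite could instead be deselected so the rest of the test run still gates the package. The following sketch uses pytest's standard collection hook; applying it to DnsimpleProviderTests here is an assumption made for illustration, not the maintainer's actual workaround:

    # conftest.py (hypothetical): skip the VCR-backed DNSimple tests until the
    # vcrpy/urllib3 version_string incompatibility is resolved.
    import pytest

    def pytest_collection_modifyitems(config, items):
        skip_vcr = pytest.mark.skip(
            reason="VCRHTTPResponse lacks version_string (urllib3 >= 2.3)"
        )
        for item in items:
            # Node IDs include the test class name, e.g. ...::DnsimpleProviderTests::test_...
            if "DnsimpleProviderTests" in item.nodeid:
                item.add_marker(skip_vcr)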
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DnsimpleProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? 
tests/providers/integration_tests.py:313: [frames, locals and _make_request listing identical to the traceback above] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsimpleProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: [traceback identical to the one above] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsimpleProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? tests/providers/integration_tests.py:541: [traceback identical to the one above] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsimpleProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: [traceback identical to the one above] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsimpleProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ self = > ??? tests/providers/integration_tests.py:211: [traceback identical to the one above] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsimpleProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? tests/providers/integration_tests.py:507: [traceback identical to the one above] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsimpleProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:199: [traceback identical to the one above] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsimpleProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:185: [traceback identical to the one above] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsimpleProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ self = > ???
_ DnsimpleProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _

    ???

tests/providers/integration_tests.py:501:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dnsimple.py:46: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dnsimple.py:211: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>               response.version_string,
E               AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsimpleProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _

    ???

tests/providers/integration_tests.py:173:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dnsimple.py:46: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dnsimple.py:211: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>               response.version_string,
E               AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsimpleProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _

    ???

tests/providers/integration_tests.py:166:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dnsimple.py:46: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dnsimple.py:211: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>               response.version_string,
E               AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsimpleProviderTests.test_provider_when_calling_update_record_should_modify_record _

    ???

tests/providers/integration_tests.py:240:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dnsimple.py:46: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dnsimple.py:211: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>               response.version_string,
E               AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsimpleProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _

    ???

tests/providers/integration_tests.py:251:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dnsimple.py:46: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dnsimple.py:211: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>               response.version_string,
E               AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsimpleProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _

    ???
tests/providers/integration_tests.py:275:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dnsimple.py:46: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dnsimple.py:211: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>               response.version_string,
E               AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsimpleProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _

    ???

tests/providers/integration_tests.py:259:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dnsimple.py:46: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dnsimple.py:211: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>               response.version_string,
E               AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_____________ DnsmadeeasyProviderTests.test_provider_authenticate ______________

    ???

tests/providers/integration_tests.py:106:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dnsmadeeasy.py:69: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dnsmadeeasy.py:215: in _request
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

method = 'GET', url = '/V2.0/dns/managed/name?domainname=capsulecd.com', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...e': 'Sun, 05 Jan 2025 19:04:30 GMT', 'x-dnsme-hmac': 'bd68887cde9220380a3a7e5b7c9149b7e7ac19df', 'Content-Length': '2'}
retries = _RetryRateLimit(total=10, connect=None, read=None, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
preload_content = False, decode_content = False, enforce_content_length = True
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DnsmadeeasyProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnsmadeeasy.py:69: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dnsmadeeasy.py:215: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/V2.0/dns/managed/name?domainname=capsulecd.com' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...e': 'Sun, 05 Jan 2025 19:04:30 GMT', 'x-dnsme-hmac': 'bd68887cde9220380a3a7e5b7c9149b7e7ac19df', 'Content-Length': '2'} retries = _RetryRateLimit(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DnsmadeeasyProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:133: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnsmadeeasy.py:69: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dnsmadeeasy.py:215: in _request ??? 
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/V2.0/dns/managed/name?domainname=capsulecd.com' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...e': 'Sun, 05 Jan 2025 19:04:30 GMT', 'x-dnsme-hmac': 'bd68887cde9220380a3a7e5b7c9149b7e7ac19df', 'Content-Length': '2'} retries = _RetryRateLimit(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DnsmadeeasyProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = > ??? tests/providers/integration_tests.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/dnsmadeeasy.py:69: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dnsmadeeasy.py:215: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/V2.0/dns/managed/name?domainname=capsulecd.com' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...e': 'Sun, 05 Jan 2025 19:04:30 GMT', 'x-dnsme-hmac': 'bd68887cde9220380a3a7e5b7c9149b7e7ac19df', 'Content-Length': '2'} retries = _RetryRateLimit(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DnsmadeeasyProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = > ??? tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnsmadeeasy.py:69: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dnsmadeeasy.py:215: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/V2.0/dns/managed/name?domainname=capsulecd.com' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...e': 'Sun, 05 Jan 2025 19:04:31 GMT', 'x-dnsme-hmac': '0226dc6d37884dfdebf62dd774369f3ec6b952d4', 'Content-Length': '2'} retries = _RetryRateLimit(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. 
If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DnsmadeeasyProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnsmadeeasy.py:69: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dnsmadeeasy.py:215: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/V2.0/dns/managed/name?domainname=capsulecd.com' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...e': 'Sun, 05 Jan 2025 19:04:31 GMT', 'x-dnsme-hmac': '0226dc6d37884dfdebf62dd774369f3ec6b952d4', 'Content-Length': '2'} retries = _RetryRateLimit(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DnsmadeeasyProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _ self = > ??? tests/providers/integration_tests.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnsmadeeasy.py:69: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dnsmadeeasy.py:215: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/V2.0/dns/managed/name?domainname=capsulecd.com' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...e': 'Sun, 05 Jan 2025 19:04:31 GMT', 'x-dnsme-hmac': '0226dc6d37884dfdebf62dd774369f3ec6b952d4', 'Content-Length': '2'} retries = _RetryRateLimit(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DnsmadeeasyProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ self = > ??? tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnsmadeeasy.py:69: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dnsmadeeasy.py:215: in _request ??? 
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/V2.0/dns/managed/name?domainname=capsulecd.com' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...e': 'Sun, 05 Jan 2025 19:04:31 GMT', 'x-dnsme-hmac': '0226dc6d37884dfdebf62dd774369f3ec6b952d4', 'Content-Length': '2'} retries = _RetryRateLimit(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DnsmadeeasyProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/dnsmadeeasy.py:69: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dnsmadeeasy.py:215: in _request ???
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
[locals and the full urllib3 _make_request() source listing elided here; they are identical to the listing reproduced in the later failures, apart from the Date and x-dnsme-hmac request headers]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsmadeeasyProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _
tests/providers/integration_tests.py:327 -> tests/providers/integration_tests.py:418: in _construct_authenticated_provider -> same call chain through dnsmadeeasy.py authenticate/_request, requests, and urllib3's _make_request as above
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsmadeeasyProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _
tests/providers/integration_tests.py:313 -> tests/providers/integration_tests.py:418: in _construct_authenticated_provider -> same call chain as above
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsmadeeasyProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _
tests/providers/integration_tests.py:294 -> tests/providers/integration_tests.py:418: in _construct_authenticated_provider -> same call chain as above
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsmadeeasyProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _
tests/providers/integration_tests.py:541 -> tests/providers/integration_tests.py:418: in _construct_authenticated_provider -> same call chain as above
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
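Note: every DnsmadeeasyProviderTests failure in this run is the same test-harness incompatibility rather than a dns-lexicon regression. At connectionpool.py:551 urllib3 (2.3 or later, judging by this code path) interpolates response.version_string into its debug log line, but the VCRHTTPResponse stub that python-vcrpy substitutes while replaying recorded cassettes does not define that attribute, so every provider authenticate() call fails inside _construct_authenticated_provider before the test body runs. Updating python-vcrpy to a release that knows about this urllib3 attribute is the clean fix. Purely as an illustration (nothing like this is applied in this build), a local conftest.py could paper over the gap; the conftest.py placement and the fixed "HTTP/1.1" value are assumptions, and the attribute appears to be read only for the log line shown in the traceback.

# conftest.py -- hypothetical local workaround, not part of this package build.
# Assumption: the missing attribute is the only incompatibility and the
# recorded cassettes otherwise replay correctly.
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):
    # urllib3 only formats this value into its '%s://%s:%s "%s %s %s" %s %s'
    # debug log line in _make_request, so a constant placeholder is enough
    # when responses come from cassettes instead of a live server.
    VCRHTTPResponse.version_string = "HTTP/1.1"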
_ DnsmadeeasyProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _
tests/providers/integration_tests.py:521 -> tests/providers/integration_tests.py:418: in _construct_authenticated_provider -> same call chain as above
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsmadeeasyProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ self = > ??? tests/providers/integration_tests.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnsmadeeasy.py:69: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dnsmadeeasy.py:215: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/V2.0/dns/managed/name?domainname=capsulecd.com' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...e': 'Sun, 05 Jan 2025 19:04:33 GMT', 'x-dnsme-hmac': 'd67294abc2e9d4483d32780f3c798eab3b6315d0', 'Content-Length': '2'} retries = _RetryRateLimit(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DnsmadeeasyProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? tests/providers/integration_tests.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnsmadeeasy.py:69: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dnsmadeeasy.py:215: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/V2.0/dns/managed/name?domainname=capsulecd.com' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...e': 'Sun, 05 Jan 2025 19:04:33 GMT', 'x-dnsme-hmac': 'd67294abc2e9d4483d32780f3c798eab3b6315d0', 'Content-Length': '2'} retries = _RetryRateLimit(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. 
Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DnsmadeeasyProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnsmadeeasy.py:69: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dnsmadeeasy.py:215: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/V2.0/dns/managed/name?domainname=capsulecd.com' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...e': 'Sun, 05 Jan 2025 19:04:33 GMT', 'x-dnsme-hmac': 'd67294abc2e9d4483d32780f3c798eab3b6315d0', 'Content-Length': '2'} retries = _RetryRateLimit(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. 
:param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DnsmadeeasyProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnsmadeeasy.py:69: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dnsmadeeasy.py:215: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/V2.0/dns/managed/name?domainname=capsulecd.com' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...e': 'Sun, 05 Jan 2025 19:04:34 GMT', 'x-dnsme-hmac': 'b7bdfada95756ee0ab4f60292ac89df6a51a9bbc', 'Content-Length': '2'} retries = _RetryRateLimit(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DnsmadeeasyProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ self = > ??? tests/providers/integration_tests.py:501: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnsmadeeasy.py:69: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dnsmadeeasy.py:215: in _request ??? 
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/V2.0/dns/managed/name?domainname=capsulecd.com' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...e': 'Sun, 05 Jan 2025 19:04:34 GMT', 'x-dnsme-hmac': 'b7bdfada95756ee0ab4f60292ac89df6a51a9bbc', 'Content-Length': '2'} retries = _RetryRateLimit(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DnsmadeeasyProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/dnsmadeeasy.py:69: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dnsmadeeasy.py:215: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/V2.0/dns/managed/name?domainname=capsulecd.com' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...e': 'Sun, 05 Jan 2025 19:04:34 GMT', 'x-dnsme-hmac': 'b7bdfada95756ee0ab4f60292ac89df6a51a9bbc', 'Content-Length': '2'} retries = _RetryRateLimit(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DnsmadeeasyProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnsmadeeasy.py:69: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dnsmadeeasy.py:215: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/V2.0/dns/managed/name?domainname=capsulecd.com' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...e': 'Sun, 05 Jan 2025 19:04:34 GMT', 'x-dnsme-hmac': 'b7bdfada95756ee0ab4f60292ac89df6a51a9bbc', 'Content-Length': '2'} retries = _RetryRateLimit(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. 
If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DnsmadeeasyProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnsmadeeasy.py:69: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dnsmadeeasy.py:215: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/V2.0/dns/managed/name?domainname=capsulecd.com' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...e': 'Sun, 05 Jan 2025 19:04:34 GMT', 'x-dnsme-hmac': 'b7bdfada95756ee0ab4f60292ac89df6a51a9bbc', 'Content-Length': '2'} retries = _RetryRateLimit(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. 
Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DnsmadeeasyProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnsmadeeasy.py:69: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dnsmadeeasy.py:215: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/V2.0/dns/managed/name?domainname=capsulecd.com' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...e': 'Sun, 05 Jan 2025 19:04:35 GMT', 'x-dnsme-hmac': 'e6f3b2e80f5217cddcb61ccfdf075813c39231ba', 'Content-Length': '2'} retries = _RetryRateLimit(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. 
If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DnsmadeeasyProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnsmadeeasy.py:69: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dnsmadeeasy.py:215: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/V2.0/dns/managed/name?domainname=capsulecd.com' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...e': 'Sun, 05 Jan 2025 19:04:35 GMT', 'x-dnsme-hmac': 'e6f3b2e80f5217cddcb61ccfdf075813c39231ba', 'Content-Length': '2'} retries = _RetryRateLimit(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _______________ DnsParkProviderTests.test_provider_authenticate ________________ self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnspark.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dnspark.py:137: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/dns/capsulecd.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DnsParkProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? 
_ DnsParkProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
tests/providers/integration_tests.py:126:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dnspark.py:36: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dnspark.py:137: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsParkProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
tests/providers/integration_tests.py:133:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dnspark.py:36: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dnspark.py:137: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsParkProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
tests/providers/integration_tests.py:156:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dnspark.py:36: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dnspark.py:137: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsParkProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _
tests/providers/integration_tests.py:147:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dnspark.py:36: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dnspark.py:137: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
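If patching the replay layer is not wanted, a packager could instead skip the affected provider suite at collection time until vcrpy and urllib3 agree again. The sketch below uses standard pytest hooks; the class-name filter is an assumption chosen to match the DnsParkProviderTests failures in this log, and the build as shown does not do this.

# conftest.py -- hypothetical collection hook; the node-id filter is an
# assumption matching the failing DnsParkProviderTests in this log.
import pytest

def pytest_collection_modifyitems(config, items):
    skip_vcr = pytest.mark.skip(
        reason="vcrpy response stub lacks the version_string attribute expected by this urllib3"
    )
    for item in items:
        if "DnsParkProviderTests" in item.nodeid:
            item.add_marker(skip_vcr)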
_ DnsParkProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _
tests/providers/integration_tests.py:140:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dnspark.py:36: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dnspark.py:137: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsParkProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _
tests/providers/integration_tests.py:303:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dnspark.py:36: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dnspark.py:137: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsParkProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _
tests/providers/integration_tests.py:327:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dnspark.py:36: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dnspark.py:137: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsParkProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _
tests/providers/integration_tests.py:313:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dnspark.py:36: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dnspark.py:137: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsParkProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _
tests/providers/integration_tests.py:294: _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnspark.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dnspark.py:137: in _request ??? [requests/urllib3 call chain and _make_request listing identical to the failure above] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsParkProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:199: [traceback identical to the failure above] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsParkProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:185: [traceback identical to the failure above] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsParkProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:173: [traceback identical to the failure above] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsParkProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? tests/providers/integration_tests.py:166: [traceback identical to the failure above] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsParkProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? tests/providers/integration_tests.py:240: [traceback identical to the failure above] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsParkProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: [traceback identical to the failure above] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DnsParkProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:259: [traceback identical to the failure above] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
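Every DnsPark failure above has the same root cause: urllib3 2.3's _make_request now logs response.version_string, but the VCRHTTPResponse stub that vcrpy substitutes during cassette replay predates that attribute. A minimal conftest-level shim along the following lines could unblock the test suite until the packaged vcrpy catches up; apart from the VCRHTTPResponse class and version_string attribute names taken from the traceback, the property body and the version mapping are illustrative assumptions, not the upstream fix.

# conftest.py (sketch): compatibility shim for the AttributeError seen above.
# Assumption: the replay stub exposes an http.client-style integer `version`
# (10 or 11); if it does not, the fallback string is used.
import vcr.stubs

def _version_string(self):
    # Translate the integer HTTP version into the string urllib3 logs,
    # defaulting to HTTP/1.1 when the stub carries no version at all.
    return {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(getattr(self, "version", None), "HTTP/1.1")

if not hasattr(vcr.stubs.VCRHTTPResponse, "version_string"):
    vcr.stubs.VCRHTTPResponse.version_string = property(_version_string)

Alternatively, pinning urllib3 below 2.3 in the chroot, or picking up a vcrpy release that ships its own version_string support, would make the shim unnecessary.
________________ DnsPodProviderTests.test_provider_authenticate ________________ self = > ???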
tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnspod.py:33: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/dnspod.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/Domain.Info' body = 'domain=capsulecd.com&login_token=placeholder_auth_username%2Cplaceholder_auth_token&format=json' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '95', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DnsPodProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnspod.py:33: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/dnspod.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/Domain.Info' body = 'domain=capsulecd.com&login_token=placeholder_auth_username%2Cplaceholder_auth_token&format=json' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '95', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DnsPodProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:133: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnspod.py:33: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/dnspod.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/Domain.Info' body = 'domain=capsulecd.com&login_token=placeholder_auth_username%2Cplaceholder_auth_token&format=json' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '95', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. 
If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DnsPodProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = > ??? tests/providers/integration_tests.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnspod.py:33: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/dnspod.py:154: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/Domain.Info' body = 'domain=capsulecd.com&login_token=placeholder_auth_username%2Cplaceholder_auth_token&format=json' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '95', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DnsPodProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = > ??? 
tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnspod.py:33: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/dnspod.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/Domain.Info' body = 'domain=capsulecd.com&login_token=placeholder_auth_username%2Cplaceholder_auth_token&format=json' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '95', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DnsPodProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnspod.py:33: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/dnspod.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/Domain.Info' body = 'domain=capsulecd.com&login_token=placeholder_auth_username%2Cplaceholder_auth_token&format=json' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '95', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DnsPodProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnspod.py:33: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/dnspod.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/Domain.Info' body = 'domain=capsulecd.com&login_token=placeholder_auth_username%2Cplaceholder_auth_token&format=json' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '95', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. 
If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DnsPodProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnspod.py:33: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/dnspod.py:154: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
    response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

The remaining DnsPodProviderTests failures all take the same path and end in the same error. Each test
builds an authenticated provider via tests/providers/integration_tests.py:418
(_construct_authenticated_provider), which calls src/lexicon/_private/providers/dnspod.py:33 (authenticate)
-> src/lexicon/interfaces.py:171 (_post) -> src/lexicon/_private/providers/dnspod.py:154 (_request)
-> requests -> urllib3. The replayed request is POST /Domain.Info with body
'domain=capsulecd.com&login_token=placeholder_auth_username%2Cplaceholder_auth_token&format=json',
and every run fails at /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551 with
AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'.

_ DnsPodProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ (tests/providers/integration_tests.py:313)
_ DnsPodProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ (tests/providers/integration_tests.py:294)
_ DnsPodProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ (tests/providers/integration_tests.py:199)
_ DnsPodProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ (tests/providers/integration_tests.py:185)
_ DnsPodProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ (tests/providers/integration_tests.py:173)
_ DnsPodProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ (tests/providers/integration_tests.py:166)
_ DnsPodProviderTests.test_provider_when_calling_update_record_should_modify_record _ (tests/providers/integration_tests.py:240)
_ DnsPodProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ (tests/providers/integration_tests.py:275)
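All of these failures point at the same root cause: the urllib3 in this chroot reads response.version_string
when logging a request, while vcrpy's VCRHTTPResponse replay stub (the cassettes used by these integration
tests) does not define that attribute. A minimal compatibility shim, assuming it is acceptable to drop a
conftest.py next to the test suite for the check() phase (a sketch only, not the upstream vcrpy fix; the
fixed attribute value is an assumption), could look like:

# conftest.py -- hypothetical shim for replaying recorded cassettes under a urllib3
# that logs response.version_string
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):
    # urllib3's connection pool only reads this attribute for its debug log line,
    # so a fixed value is enough for replayed responses.
    VCRHTTPResponse.version_string = "HTTP/1.1"

The alternative, if a vcrpy release with its own version_string support is available, would be to update
python-vcrpy in the build root instead of patching around it.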
_ DnsPodProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _
self = > ???
tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/dnspod.py:33: in authenticate ???
src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params)
src/lexicon/_private/providers/dnspod.py:154: in _request ???
[... requests/urllib3 frames and urllib3 _make_request source identical to the traceback above elided; cassette replay of POST '/Domain.Info' with placeholder credentials ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_____________ DNSservicesProviderTests.test_provider_authenticate ______________
tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/dnsservices.py:47: in authenticate ???
[... identical requests/urllib3 frames and _make_request source elided; cassette replay of POST '/api/login' with placeholder credentials ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DNSservicesProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
tests/providers/integration_tests.py:126 -> integration_tests.py:418 -> dnsservices.py:47: in authenticate
[... identical traceback elided ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DNSservicesProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
tests/providers/integration_tests.py:133 -> integration_tests.py:418 -> dnsservices.py:47: in authenticate
[... identical traceback elided ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DNSservicesProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
tests/providers/integration_tests.py:156 -> integration_tests.py:418 -> dnsservices.py:47: in authenticate
[... identical traceback elided ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DNSservicesProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _
tests/providers/integration_tests.py:147 -> integration_tests.py:418 -> dnsservices.py:47: in authenticate
[... identical traceback elided ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DNSservicesProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _
tests/providers/integration_tests.py:140 -> integration_tests.py:418 -> dnsservices.py:47: in authenticate
[... identical traceback elided ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DNSservicesProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _
tests/providers/integration_tests.py:489 -> integration_tests.py:418 -> dnsservices.py:47: in authenticate
[... identical traceback elided ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
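All of the failures above share one root cause: the installed urllib3 interpolates response.version_string into its debug log line at connectionpool.py:551, but the response object replayed from the VCR cassette is a vcrpy VCRHTTPResponse that does not define that attribute, so every cassette-backed request raises AttributeError before the provider test ever sees a response. A minimal local workaround, assuming vcrpy exposes the stub at vcr.stubs (newer vcrpy releases appear to ship the attribute themselves), would be a conftest.py shim along these lines, sketched here rather than taken from dns-lexicon upstream:

    # conftest.py -- hypothetical workaround, not part of dns-lexicon upstream.
    from vcr.stubs import VCRHTTPResponse

    # urllib3 only uses version_string to format a debug log line in
    # _make_request, so a static placeholder is enough to keep cassette
    # replay working when the stub predates the attribute.
    if not hasattr(VCRHTTPResponse, "version_string"):
        VCRHTTPResponse.version_string = "HTTP/1.1"

Updating python-vcrpy (or constraining urllib3 to a release that does not log version_string) should make such a shim unnecessary.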
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DNSservicesProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ self = > ??? tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnsservices.py:47: in authenticate ??? 
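Every failure in this run is the same incompatibility: the urllib3 in the chroot reads `response.version_string` when logging the request (an attribute added in the urllib3 2.3 series), while the `VCRHTTPResponse` stub that vcrpy substitutes when replaying the recorded cassettes does not define it, so `authenticate()` never gets past the replayed `POST /api/login`. A minimal, hypothetical test-side shim (a local `conftest.py`, not part of dns-lexicon or vcrpy upstream) could paper over the missing attribute while the stub lags behind:

```python
# conftest.py -- hypothetical local workaround, assuming the stub is
# vcr.stubs.VCRHTTPResponse as the AttributeError above suggests.
from vcr.stubs import VCRHTTPResponse

# urllib3 >= 2.3 reads response.version_string for its debug log line;
# older vcrpy response stubs never grew that attribute, so replayed
# cassettes crash before the provider test even starts.
if not hasattr(VCRHTTPResponse, "version_string"):
    # Approximate urllib3's mapping of the numeric HTTP version
    # (10 -> HTTP/1.0, 11 -> HTTP/1.1); fall back to HTTP/? otherwise.
    VCRHTTPResponse.version_string = property(
        lambda self: {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(
            getattr(self, "version", 11), "HTTP/?"
        )
    )
```

Updating python-vcrpy to a release that knows about urllib3 2.3, or temporarily constraining urllib3 in the build root, should make a shim like this unnecessary.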
/usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/login' body = 'username=placeholder_auth_username&password=placeholder_auth_password' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '69', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DNSservicesProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnsservices.py:47: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/login' body = 'username=placeholder_auth_username&password=placeholder_auth_password' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '69', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DNSservicesProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnsservices.py:47: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/login' body = 'username=placeholder_auth_username&password=placeholder_auth_password' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '69', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DNSservicesProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnsservices.py:47: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/login' body = 'username=placeholder_auth_username&password=placeholder_auth_password' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '69', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from 
our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DNSservicesProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnsservices.py:47: in authenticate ??? 
/usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/login' body = 'username=placeholder_auth_username&password=placeholder_auth_password' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '69', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DNSservicesProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnsservices.py:47: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/login' body = 'username=placeholder_auth_username&password=placeholder_auth_password' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '69', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DNSservicesProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnsservices.py:47: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/login' body = 'username=placeholder_auth_username&password=placeholder_auth_password' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '69', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DNSservicesProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ self = > ??? tests/providers/integration_tests.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnsservices.py:47: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/login' body = 'username=placeholder_auth_username&password=placeholder_auth_password' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '69', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DNSservicesProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? tests/providers/integration_tests.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnsservices.py:47: in authenticate ??? 
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DNSservicesProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _
tests/providers/integration_tests.py:199:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/dnsservices.py:47: in authenticate
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DNSservicesProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _
tests/providers/integration_tests.py:185:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/dnsservices.py:47: in authenticate
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
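Every failure in this block is the same one: urllib3's _make_request() finishes by logging the response (connectionpool.py:551), and that log call reads response.version_string, an attribute urllib3 introduced on its response classes in the 2.3 release line. The VCRHTTPResponse objects that vcrpy substitutes while replaying cassettes predate that attribute, so the very first replayed request of each test dies inside the debug-logging call. A quick introspection check along these lines (class names taken from the traceback, nothing dns-lexicon specific) would confirm the mismatch inside the build chroot:

    # Hedged sanity check: confirm the installed urllib3 expects version_string
    # while the installed vcrpy replay class does not provide it.
    import urllib3
    from urllib3.response import HTTPResponse
    from vcr.stubs import VCRHTTPResponse

    print(urllib3.__version__)                          # expected >= 2.3, given the attribute access above
    print(hasattr(HTTPResponse, "version_string"))      # True on urllib3 >= 2.3
    print(hasattr(VCRHTTPResponse, "version_string"))   # False on a vcrpy that predates the fix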
_ DNSservicesProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _
tests/providers/integration_tests.py:501:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/dnsservices.py:47: in authenticate
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DNSservicesProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _
tests/providers/integration_tests.py:173:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/dnsservices.py:47: in authenticate
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
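If moving to a vcrpy release that implements version_string is not immediately possible, one conceivable stopgap is a small conftest-level shim. The sketch below is purely hypothetical (it is not part of dns-lexicon or vcrpy); it only gives the replayed response object the attribute that the log.debug() call in the traceback reads:

    # Hypothetical local workaround for a test conftest.py, not upstream code.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # A fixed value satisfies the debug-logging call shown in the traceback;
        # nothing else in that code path reads the attribute.
        VCRHTTPResponse.version_string = "HTTP/1.1"  # type: ignore[attr-defined]

The durable fix is a vcrpy build that supports urllib3 2.3 (or constraining urllib3 below 2.3 for the test run); a shim like this only papers over the mismatch for the duration of the suite.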
/usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/login' body = 'username=placeholder_auth_username&password=placeholder_auth_password' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '69', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DNSservicesProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnsservices.py:47: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/login' body = 'username=placeholder_auth_username&password=placeholder_auth_password' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '69', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DNSservicesProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dnsservices.py:47: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/login' body = 'username=placeholder_auth_username&password=placeholder_auth_password' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '69', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
            if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET:
                raise

        # Reset the timeout for the recv() on the socket
        read_timeout = timeout_obj.read_timeout

        if not conn.is_closed:
            # In Python 3 socket.py will catch EAGAIN and return None when you
            # try and read into the file pointer created by http.client, which
            # instead raises a BadStatusLine exception. Instead of catching
            # the exception and assuming all BadStatusLine exceptions are read
            # timeouts, check for a zero timeout before making the request.
            if read_timeout == 0:
                raise ReadTimeoutError(
                    self, url, f"Read timed out. (read timeout={read_timeout})"
                )
            conn.timeout = read_timeout

        # Receive the response from the server
        try:
            response = conn.getresponse()
        except (BaseSSLError, OSError) as e:
            self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
            raise

        # Set properties that are used by the pooling layer.
        response.retries = retries
        response._connection = response_conn  # type: ignore[attr-defined]
        response._pool = self  # type: ignore[attr-defined]

        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DNSservicesProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _

self = >

    ???

tests/providers/integration_tests.py:251:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dnsservices.py:47: in authenticate
    ???
/usr/lib/python3.13/site-packages/requests/api.py:115: in post
    return request("post", url, data=data, json=json, **kwargs)
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
conn =
method = 'POST', url = '/api/login'
body = 'username=placeholder_auth_username&password=placeholder_auth_password'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '69', 'Content-Type': 'application/x-www-form-urlencoded'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =
preload_content = False, decode_content = False, enforce_content_length = True

>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
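Every failure in this run stops at the same statement: urllib3's _make_request() logs the response line and reads response.version_string, an attribute the urllib3 2.3 series added to its response objects, while the VCRHTTPResponse that vcrpy substitutes during cassette playback predates that attribute. A rough sketch of what urllib3 expects from a response here, illustrative only and not urllib3's actual code (the helper name and the version mapping are assumptions):

    # Sketch of the idea, not urllib3's implementation: the string form of the
    # HTTP version is derived from the integer http.client-style `version`
    # field that real responses carry (e.g. 11 -> "HTTP/1.1").
    _VERSIONS = {9: "HTTP/0.9", 10: "HTTP/1.0", 11: "HTTP/1.1"}

    def version_string(version: int) -> str:
        # Fall back to a placeholder for unknown values, as any shim should.
        return _VERSIONS.get(version, "HTTP/?")

Because the stub response lacks the attribute entirely, the debug-log call fails before the test ever sees the replayed response.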
_ DNSservicesProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _

self = >

    ???

tests/providers/integration_tests.py:275:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dnsservices.py:47: in authenticate
    ???
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DNSservicesProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _

self = >

    ???

tests/providers/integration_tests.py:259:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dnsservices.py:47: in authenticate
    ???
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
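The DNSservices and Dreamhost suites fail identically because every integration test first builds an authenticated provider (integration_tests.py:418), which replays the recorded login or record-listing call through requests and urllib3, so this is a test-harness incompatibility rather than a lexicon regression. A single cassette-backed test reproduces it in isolation; a hypothetical in-process spot check, assuming the test layout shown in the tracebacks:

    # Hypothetical spot check inside the build chroot: replay one recorded
    # Dreamhost test; -x stops at the first failure.
    import pytest

    raise SystemExit(
        pytest.main(
            ["tests/providers", "-x", "-k",
             "DreamhostProviderTests and test_provider_authenticate"]
        )
    )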
______________ DreamhostProviderTests.test_provider_authenticate _______________

self = >

    ???

tests/providers/integration_tests.py:106:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dreamhost.py:107: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dreamhost.py:241: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
conn =
method = 'GET'
url = '/?key=placeholder_auth_token&format=json&cmd=dns-list_records'
body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =
preload_content = False, decode_content = False, enforce_content_length = True

>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DreamhostProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

self = >

    ???

tests/providers/integration_tests.py:126:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dreamhost.py:107: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dreamhost.py:241: in _request
    ???
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DreamhostProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

self = >

    ???
tests/providers/integration_tests.py:133:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dreamhost.py:107: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dreamhost.py:241: in _request
    ???
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DreamhostProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

self = >

    ???

tests/providers/integration_tests.py:156:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dreamhost.py:107: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dreamhost.py:241: in _request
    ???
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DreamhostProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

self = >

    ???

tests/providers/integration_tests.py:147:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dreamhost.py:107: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dreamhost.py:241: in _request
    ???
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
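Until python-vcrpy ships a release whose VCRHTTPResponse implements version_string (or the test environment constrains urllib3 below the release that logs it), a local shim is one way to let check() run. A minimal conftest.py sketch, assuming VCRHTTPResponse is still importable from vcr.stubs and that the cassettes in this suite were recorded over HTTP/1.1:

    # Hypothetical local work-around for the build chroot only, not upstream code.
    import vcr.stubs

    if not hasattr(vcr.stubs.VCRHTTPResponse, "version_string"):
        # Assumption: a constant is sufficient because, as the traceback above
        # shows, the value is only read for urllib3's debug log line.
        vcr.stubs.VCRHTTPResponse.version_string = "HTTP/1.1"

Updating python-vcrpy is the cleaner fix; the shim only unblocks the rebuild.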
_ DreamhostProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _

self = >

    ???

tests/providers/integration_tests.py:140:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dreamhost.py:107: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dreamhost.py:241: in _request
    ???
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DreamhostProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _

self = >

    ???
tests/providers/integration_tests.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dreamhost.py:107: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dreamhost.py:241: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/?key=placeholder_auth_token&format=json&cmd=dns-list_records' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DreamhostProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ self = > ??? tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dreamhost.py:107: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dreamhost.py:241: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/?key=placeholder_auth_token&format=json&cmd=dns-list_records' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DreamhostProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dreamhost.py:107: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dreamhost.py:241: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/?key=placeholder_auth_token&format=json&cmd=dns-list_records' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DreamhostProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dreamhost.py:107: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dreamhost.py:241: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/?key=placeholder_auth_token&format=json&cmd=dns-list_records' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DreamhostProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? 
tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dreamhost.py:107: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dreamhost.py:241: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/?key=placeholder_auth_token&format=json&cmd=dns-list_records' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DreamhostProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dreamhost.py:107: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dreamhost.py:241: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/?key=placeholder_auth_token&format=json&cmd=dns-list_records' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DreamhostProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dreamhost.py:107: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dreamhost.py:241: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/?key=placeholder_auth_token&format=json&cmd=dns-list_records' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DreamhostProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dreamhost.py:107: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dreamhost.py:241: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
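Every Dreamhost failure in this run is the same incompatibility: the urllib3 copy installed in the chroot logs response.version_string inside connectionpool._make_request (the line marked with > above), while the cassette-backed VCRHTTPResponse object that vcrpy substitutes for a live response does not define that attribute. A minimal local workaround sketch, assuming the offending class is vcr.stubs.VCRHTTPResponse and that patching it from the test suite's conftest.py is acceptable until vcrpy ships its own fix:

# conftest.py -- hypothetical compatibility shim, not part of upstream lexicon or vcrpy
try:
    from vcr.stubs import VCRHTTPResponse  # assumption: the class named in the AttributeError above

    if not hasattr(VCRHTTPResponse, "version_string"):
        # In the traceback above the value is only interpolated into a debug log
        # line, so a fixed HTTP/1.1 marker is enough for replayed cassettes.
        VCRHTTPResponse.version_string = "HTTP/1.1"
except ImportError:
    # vcrpy not installed; nothing to patch.
    pass

With such a shim in place, the responses replayed from the cassettes again satisfy the attribute lookup that _make_request performs before returning.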
_ DreamhostProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _

self = > ???

tests/providers/integration_tests.py:507: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dreamhost.py:107: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dreamhost.py:241: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DreamhostProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _

self = > ???

tests/providers/integration_tests.py:199: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dreamhost.py:107: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dreamhost.py:241: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DreamhostProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _

self = > ???

tests/providers/integration_tests.py:185: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dreamhost.py:107: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dreamhost.py:241: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DreamhostProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _

self = > ???

tests/providers/integration_tests.py:501: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dreamhost.py:107: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dreamhost.py:241: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DreamhostProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _

self = > ???

tests/providers/integration_tests.py:173: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dreamhost.py:107: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dreamhost.py:241: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DreamhostProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _

self = > ???

tests/providers/integration_tests.py:166: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dreamhost.py:107: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dreamhost.py:241: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DreamhostProviderTests.test_provider_when_calling_update_record_should_modify_record _

self = > ???

tests/providers/integration_tests.py:240: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dreamhost.py:107: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dreamhost.py:241: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
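If patching vcrpy is undesirable for a packaging build, the same failures could instead be deselected at collection time. A sketch, assuming a conftest.py placed in the tests/ directory so that the glob resolves relative to it, and assuming the Dreamhost cases live in providers/test_dreamhost.py (both paths are assumptions, not taken from this log):

# conftest.py -- hypothetical collection filter
# Skip the Dreamhost provider tests, all of which replay cassettes through the
# affected VCRHTTPResponse, instead of patching the stub class.
collect_ignore_glob = ["providers/test_dreamhost.py"]

Either approach only works around the mismatch; the proper fix is for vcrpy's response stub to expose the attribute itself.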
_ DreamhostProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _

self = > ???

tests/providers/integration_tests.py:251: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/dreamhost.py:107: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dreamhost.py:241: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DreamhostProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _

self = > ???
tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dreamhost.py:107: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dreamhost.py:241: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/?key=placeholder_auth_token&format=json&cmd=dns-list_records' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DreamhostProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dreamhost.py:107: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dreamhost.py:241: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/?key=placeholder_auth_token&format=json&cmd=dns-list_records' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DuckdnsProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:127: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/_private/providers/duckdns.py:152: in create_record ??? src/lexicon/_private/providers/duckdns.py:242: in _api_call ??? src/lexicon/_private/providers/duckdns.py:274: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/update?domains=testlexicon&ip=127.0.0.1&token=placeholder_auth_token&verbose=true' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. 
Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError ------------------------------ Captured log call ------------------------------- WARNING lexicon._private.providers.duckdns:duckdns.py:133 create_record: Duck DNS does not support modifying the TTL, ignoring 3600 _ DuckdnsProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = > ??? tests/providers/integration_tests.py:157: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/_private/providers/duckdns.py:152: in create_record ??? src/lexicon/_private/providers/duckdns.py:242: in _api_call ??? src/lexicon/_private/providers/duckdns.py:274: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/update?domains=testlexicon&txt=challengetoken&token=placeholder_auth_token&verbose=true' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError ------------------------------ Captured log call ------------------------------- WARNING lexicon._private.providers.duckdns:duckdns.py:133 create_record: Duck DNS does not support modifying the TTL, ignoring 3600 _ DuckdnsProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = > ??? tests/providers/integration_tests.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/_private/providers/duckdns.py:152: in create_record ??? src/lexicon/_private/providers/duckdns.py:242: in _api_call ??? src/lexicon/_private/providers/duckdns.py:274: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/update?domains=testlexicon&txt=challengetoken&token=placeholder_auth_token&verbose=true' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError ------------------------------ Captured log call ------------------------------- WARNING lexicon._private.providers.duckdns:duckdns.py:133 create_record: Duck DNS does not support modifying the TTL, ignoring 3600 _ DuckdnsProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? 
tests/providers/integration_tests.py:141: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/_private/providers/duckdns.py:152: in create_record ??? src/lexicon/_private/providers/duckdns.py:242: in _api_call ??? src/lexicon/_private/providers/duckdns.py:274: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/update?domains=testlexicon&txt=challengetoken&token=placeholder_auth_token&verbose=true' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. 
:param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError ------------------------------ Captured log call ------------------------------- WARNING lexicon._private.providers.duckdns:duckdns.py:133 create_record: Duck DNS does not support modifying the TTL, ignoring 3600 _ DuckdnsProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ self = mock_rrset = > ??? tests/providers/test_duckdns.py:99: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:39: in wrapper ??? tests/providers/integration_tests.py:476: in test_provider_when_calling_create_record_with_duplicate_records_should_be_noop ??? src/lexicon/_private/providers/duckdns.py:152: in create_record ??? src/lexicon/_private/providers/duckdns.py:242: in _api_call ??? src/lexicon/_private/providers/duckdns.py:274: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/update?domains=testlexicon&txt=challengetoken&token=placeholder_auth_token&verbose=true' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError ------------------------------ Captured log call ------------------------------- WARNING lexicon._private.providers.duckdns:duckdns.py:133 create_record: Duck DNS does not support modifying the TTL, ignoring 3600 _ DuckdnsProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = mock_rrset = > ??? tests/providers/test_duckdns.py:191: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:39: in wrapper ??? tests/providers/integration_tests.py:304: in test_provider_when_calling_delete_record_by_filter_should_remove_record ??? src/lexicon/_private/providers/duckdns.py:152: in create_record ??? src/lexicon/_private/providers/duckdns.py:242: in _api_call ??? src/lexicon/_private/providers/duckdns.py:274: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  conn = 
method = 'GET'
url = '/update?domains=testlexicon&txt=challengetoken&token=placeholder_auth_token&verbose=true'
body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =  preload_content = False, decode_content = False, enforce_content_length = True

E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
------------------------------ Captured log call -------------------------------
WARNING  lexicon._private.providers.duckdns:duckdns.py:133 create_record: Duck DNS does not support modifying the TTL, ignoring 3600
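Every remaining provider failure in this run is the same mismatch: the chroot's urllib3 (connectionpool.py:551 above) reads response.version_string while logging a finished request, but the VCRHTTPResponse stub that vcrpy substitutes when replaying recorded cassettes does not define that attribute. A minimal local shim, assuming the stub is still importable as vcr.stubs.VCRHTTPResponse and that a hard-coded "HTTP/1.1" placeholder is acceptable for replayed responses, could be dropped into the test suite's conftest.py until vcrpy ships a release that defines the attribute itself:

    # conftest.py -- hypothetical local shim, not part of dns-lexicon or vcrpy
    from vcr.stubs import VCRHTTPResponse

    # urllib3's connection pool logs response.version_string after a request
    # completes; recorded cassettes carry no HTTP version, so fake one here.
    if not hasattr(VCRHTTPResponse, "version_string"):
        VCRHTTPResponse.version_string = "HTTP/1.1"

Pinning urllib3 to a release that predates the version_string logging would avoid the error as well, at the cost of diverging from the repository packages.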
_ DuckdnsProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _

self = mock_rrset = > ???

tests/providers/test_duckdns.py:210:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:39: in wrapper
tests/providers/integration_tests.py:328: in test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record
src/lexicon/_private/providers/duckdns.py:152: in create_record
src/lexicon/_private/providers/duckdns.py:242: in _api_call
src/lexicon/_private/providers/duckdns.py:274: in _request
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
------------------------------ Captured log call -------------------------------
WARNING  lexicon._private.providers.duckdns:duckdns.py:133 create_record: Duck DNS does not support modifying the TTL, ignoring 3600
_ DuckdnsProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _

self = mock_rrset = > ???

tests/providers/test_duckdns.py:229:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:39: in wrapper
tests/providers/integration_tests.py:314: in test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record
src/lexicon/_private/providers/duckdns.py:152: in create_record
src/lexicon/_private/providers/duckdns.py:242: in _api_call
src/lexicon/_private/providers/duckdns.py:274: in _request
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
------------------------------ Captured log call -------------------------------
WARNING  lexicon._private.providers.duckdns:duckdns.py:133 create_record: Duck DNS does not support modifying the TTL, ignoring 3600
_ DuckdnsProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _

self = mock_rrset = > ???

tests/providers/test_duckdns.py:248:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:39: in wrapper
tests/providers/integration_tests.py:295: in test_provider_when_calling_delete_record_by_identifier_should_remove_record
src/lexicon/_private/providers/duckdns.py:152: in create_record
src/lexicon/_private/providers/duckdns.py:242: in _api_call
src/lexicon/_private/providers/duckdns.py:274: in _request
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
------------------------------ Captured log call -------------------------------
WARNING  lexicon._private.providers.duckdns:duckdns.py:133 create_record: Duck DNS does not support modifying the TTL, ignoring 3600
_ DuckdnsProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _

self = mock_rrset = > ???

tests/providers/test_duckdns.py:127:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:39: in wrapper
tests/providers/integration_tests.py:200: in test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record
src/lexicon/_private/providers/duckdns.py:152: in create_record
src/lexicon/_private/providers/duckdns.py:242: in _api_call
src/lexicon/_private/providers/duckdns.py:274: in _request
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
------------------------------ Captured log call -------------------------------
WARNING  lexicon._private.providers.duckdns:duckdns.py:133 create_record: Duck DNS does not support modifying the TTL, ignoring 3600
_ DuckdnsProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _

self = mock_rrset = > ???

tests/providers/test_duckdns.py:142:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:39: in wrapper
tests/providers/integration_tests.py:186: in test_provider_when_calling_list_records_with_full_name_filter_should_return_record
src/lexicon/_private/providers/duckdns.py:152: in create_record
src/lexicon/_private/providers/duckdns.py:242: in _api_call
src/lexicon/_private/providers/duckdns.py:274: in _request
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
------------------------------ Captured log call -------------------------------
WARNING  lexicon._private.providers.duckdns:duckdns.py:133 create_record: Duck DNS does not support modifying the TTL, ignoring 3600
_ DuckdnsProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _

self = mock_rrset = > ???

tests/providers/test_duckdns.py:157:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:39: in wrapper
tests/providers/integration_tests.py:174: in test_provider_when_calling_list_records_with_name_filter_should_return_record
src/lexicon/_private/providers/duckdns.py:152: in create_record
src/lexicon/_private/providers/duckdns.py:242: in _api_call
src/lexicon/_private/providers/duckdns.py:274: in _request
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
------------------------------ Captured log call -------------------------------
WARNING  lexicon._private.providers.duckdns:duckdns.py:133 create_record: Duck DNS does not support modifying the TTL, ignoring 3600
_ DuckdnsProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _

self = mock_rrset = > ???

tests/providers/test_duckdns.py:172:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:39: in wrapper
tests/providers/integration_tests.py:252: in test_provider_when_calling_update_record_should_modify_record_name_specified
src/lexicon/_private/providers/duckdns.py:152: in create_record
src/lexicon/_private/providers/duckdns.py:242: in _api_call
src/lexicon/_private/providers/duckdns.py:274: in _request
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
------------------------------ Captured log call -------------------------------
WARNING  lexicon._private.providers.duckdns:duckdns.py:133 create_record: Duck DNS does not support modifying the TTL, ignoring 3600
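The identical AttributeError also brings down the Dynu provider tests that follow. If the immediate goal is only to get the package's check() phase through while the vcrpy/urllib3 incompatibility is settled upstream, one packaging-side sketch (the module list below is an assumption drawn only from the failures visible in this log) is to exclude the affected cassette-based modules from collection in tests/conftest.py:

    # tests/conftest.py -- hypothetical exclusion, not part of the upstream test suite
    # Skip the provider modules whose cassette replay trips over the missing
    # VCRHTTPResponse.version_string attribute.
    collect_ignore_glob = [
        "providers/test_duckdns.py",
        "providers/test_dynu.py",
    ]

Deselecting the individual tests on the pytest command line would work just as well; either way the skipped coverage should come back once the stub gains the attribute.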
_________________ DynuProviderTests.test_provider_authenticate _________________

self = > ???

tests/providers/integration_tests.py:106:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/dynu.py:36: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dynu.py:166: in _request
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

method = 'GET', url = '/v2/dns', body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'API-Key': 'placeholder_auth_token', 'Content-Type': 'application/json'}

E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DynuProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

self = > ???
tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dynu.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dynu.py:166: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/dns', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'API-Key': 'placeholder_auth_token', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
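The locals echoed in each traceback show how requests' adapter drives this code: retries=Retry(total=0, connect=None, read=False, ...) and an all-None Timeout, i.e. fail fast and apply no client-side timeout. For reference, a small sketch of building the same objects directly against urllib3 (the URL is a placeholder for illustration, not a call the test suite actually makes):

    # Illustrative only -- mirrors the retries/timeout locals shown above, not lexicon code.
    import urllib3
    from urllib3.util.retry import Retry
    from urllib3.util.timeout import Timeout

    http = urllib3.PoolManager()
    resp = http.request(
        "GET",
        "https://api.dynu.com/v2/dns",                         # placeholder URL
        retries=Retry(total=0, read=False),                    # no retries; re-raise read errors
        timeout=Timeout(connect=None, read=None, total=None),  # no client-side timeouts
    )
    print(resp.status)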
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DynuProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:133: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dynu.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dynu.py:166: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/dns', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'API-Key': 'placeholder_auth_token', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. 
Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DynuProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = > ??? tests/providers/integration_tests.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dynu.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dynu.py:166: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/dns', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'API-Key': 'placeholder_auth_token', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DynuProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = > ??? tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dynu.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dynu.py:166: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/dns', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'API-Key': 'placeholder_auth_token', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
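The 'API-Key': 'placeholder_auth_token' header in the request locals is the giveaway that these integration tests never talk to the real Dynu API: vcrpy intercepts the HTTP layer and replays a recorded cassette, which is exactly where the incompatible VCRHTTPResponse stub enters the picture. A rough sketch of that record/replay pattern, with an assumed cassette path rather than the one lexicon actually uses:

    # Assumed illustration of vcrpy replay -- cassette path and setup are hypothetical.
    import vcr
    import requests

    replay = vcr.VCR(cassette_library_dir="tests/fixtures/cassettes", record_mode="none")

    with replay.use_cassette("dynu/test_provider_authenticate.yaml"):
        # Served entirely from the cassette; the placeholder credential never reaches Dynu.
        resp = requests.get(
            "https://api.dynu.com/v2/dns",
            headers={"API-Key": "placeholder_auth_token"},
        )
        print(resp.status_code)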
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DynuProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dynu.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dynu.py:166: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/dns', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'API-Key': 'placeholder_auth_token', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DynuProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _ self = > ??? 
tests/providers/integration_tests.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dynu.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dynu.py:166: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/dns', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'API-Key': 'placeholder_auth_token', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
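Only the cassette-backed responses lack the attribute; a live urllib3 response exposes it on the releases that define it, which is why the same log.debug() line is harmless outside the replayed test suite. A quick, assumed check (plain network call, not part of the build):

    # Illustrative check, not from the build: a real urllib3 response carries the attribute
    # on releases that define it, unlike the VCR stub replayed in these tests.
    import urllib3

    resp = urllib3.PoolManager().request("GET", "http://example.com/", retries=False)
    print(getattr(resp, "version_string", "attribute not defined on this urllib3"))
    print(resp.status, resp.length_remaining)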
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DynuProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ self = > ??? tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dynu.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dynu.py:166: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/dns', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'API-Key': 'placeholder_auth_token', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. 
Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DynuProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dynu.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dynu.py:166: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/dns', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'API-Key': 'placeholder_auth_token', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
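Every Dynu failure above and below stops at the same frame: the urllib3 installed in the build root (connectionpool.py:551) includes response.version_string in its debug log line, and the VCRHTTPResponse stub that vcrpy substitutes during cassette playback does not define that attribute, so each test dies inside authenticate() before reaching any provider assertion. A minimal test-side shim is sketched below; it assumes the stub is importable as vcr.stubs.VCRHTTPResponse and that urllib3 only reads version_string for this log call, and it would live in a hypothetical conftest.py rather than in dns-lexicon itself.

    # conftest.py sketch (hypothetical packaging workaround, not part of dns-lexicon):
    # give vcrpy's stub response the version_string property that the installed
    # urllib3 reads when logging the request line during cassette playback.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        VCRHTTPResponse.version_string = property(
            # best-effort mapping from http.client-style version numbers
            lambda self: "HTTP/1.1" if getattr(self, "version", 11) == 11 else "HTTP/1.0"
        )

A vcrpy release that defines the attribute itself turns the shim into a no-op, which is why it is guarded with hasattr().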
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DynuProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dynu.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dynu.py:166: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/dns', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'API-Key': 'placeholder_auth_token', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DynuProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dynu.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dynu.py:166: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/dns', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'API-Key': 'placeholder_auth_token', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DynuProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? 
tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dynu.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dynu.py:166: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/dns', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'API-Key': 'placeholder_auth_token', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
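The retries and timeout parameters described in the docstring above are the same objects visible in the captured locals (retries = Retry(total=0, connect=None, read=False, redirect=None, status=None), timeout = Timeout(connect=None, read=None, total=None)), i.e. requests drives a single, non-retried attempt through this pool. A small illustrative sketch of those semantics, using a placeholder URL:

    from urllib3 import PoolManager
    from urllib3.util.retry import Retry
    from urllib3.util.timeout import Timeout

    http = PoolManager()

    # Per the docstring: None retries until a response arrives, an int retries
    # connection errors only, zero never retries, and False disables the Retry
    # machinery so the underlying exception escapes unchanged.
    no_retry = Retry(total=0, connect=None, read=False, redirect=None, status=None)

    # timeout accepts either a float (seconds) or a Timeout split into phases.
    resp = http.request(
        "GET",
        "https://example.org/",                  # placeholder host
        retries=no_retry,
        timeout=Timeout(connect=2.0, read=10.0),
    )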
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DynuProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dynu.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dynu.py:166: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/dns', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'API-Key': 'placeholder_auth_token', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DynuProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dynu.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dynu.py:166: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/dns', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'API-Key': 'placeholder_auth_token', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DynuProviderTests.test_provider_when_calling_list_records_after_setting_ttl __ self = > ??? tests/providers/integration_tests.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dynu.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dynu.py:166: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/dns', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'API-Key': 'placeholder_auth_token', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
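Because each of these DynuProviderTests failures is raised inside urllib3's request path while VCR replays the cassette, not by the provider's own record handling, a packager who prefers not to patch vcrpy could instead skip the affected suite at collection time. A sketch of such a hook, again for a hypothetical conftest.py:

    # conftest.py sketch (hypothetical): deselect the Dynu provider suite while
    # the recorded-response stub is incompatible with the installed urllib3.
    import pytest

    def pytest_collection_modifyitems(config, items):
        skip = pytest.mark.skip(reason="VCR stub lacks version_string for this urllib3")
        for item in items:
            if "DynuProviderTests" in item.nodeid:
                item.add_marker(skip)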
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DynuProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? tests/providers/integration_tests.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/dynu.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/dynu.py:166: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/dns', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'API-Key': 'placeholder_auth_token', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ DynuProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? 
tests/providers/integration_tests.py:199:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/dynu.py:36: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dynu.py:166: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
[urllib3 _make_request locals and source omitted here and in the condensed tracebacks below; they are identical to the first failure above]
> response.version_string, response.status, response.length_remaining, )
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DynuProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _
self = > ???
tests/providers/integration_tests.py:185: [call chain and urllib3 frames identical to the first traceback above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DynuProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _
self = > ???
tests/providers/integration_tests.py:501: [call chain and urllib3 frames identical to the first traceback above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
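Every one of these Dynu failures dies in the same frame: authenticate() issues a GET against the recorded /v2/dns endpoint, the request is replayed from a vcrpy cassette, and urllib3's debug logging at connectionpool.py:551 reads response.version_string, an attribute that vcrpy's VCRHTTPResponse playback object does not define. The snippet below is a diagnostic sketch rather than part of the test suite; it assumes the class named in the error lives at vcr.stubs.VCRHTTPResponse and that the chroot carries a urllib3 2.3-or-newer release, the combination that appears to trigger this AttributeError.

    # Diagnostic sketch (hypothetical; run inside the build chroot's Python)
    from importlib.metadata import version
    import urllib3
    from vcr.stubs import VCRHTTPResponse  # assumed location of the class named in the error

    print("urllib3:", urllib3.__version__)            # 2.3+ includes version_string in its debug log line
    print("vcrpy:", version("vcrpy"))                 # releases predating urllib3 2.3 lack the attribute
    print("stub has version_string:", hasattr(VCRHTTPResponse, "version_string"))

If the last line prints False, any test that replays HTTP traffic through urllib3's connection pool would be expected to fail exactly as above.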
_ DynuProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _
self = > ???
tests/providers/integration_tests.py:173: [call chain and urllib3 frames identical to the first traceback above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DynuProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _
self = > ???
tests/providers/integration_tests.py:166: [call chain and urllib3 frames identical to the first traceback above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
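The durable fix is a vcrpy release that knows about urllib3's version_string; until one is packaged, a build-local workaround is to backfill the attribute before the suite runs. The conftest.py sketch below is a hypothetical patch, not something shipped by lexicon or vcrpy: it gives the stub class a constant value, which is only consumed by the log.debug() call visible in these tracebacks.

    # conftest.py (hypothetical workaround sketch)
    from vcr.stubs import VCRHTTPResponse  # assumed location of vcrpy's playback response

    if not hasattr(VCRHTTPResponse, "version_string"):
        # urllib3 >= 2.3 only reads this attribute when emitting its debug log
        # line, so a fixed value is sufficient for cassette playback.
        VCRHTTPResponse.version_string = "HTTP/1.1"

Dropping the patch again once an updated vcrpy lands keeps the suite aligned with upstream.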
_ DynuProviderTests.test_provider_when_calling_update_record_should_modify_record _
self = > ???
tests/providers/integration_tests.py:240: [call chain and urllib3 frames identical to the first traceback above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DynuProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _
self = > ???
tests/providers/integration_tests.py:251: [call chain and urllib3 frames identical to the first traceback above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DynuProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _
self = > ???
tests/providers/integration_tests.py:275: [call chain and urllib3 frames identical to the first traceback above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ DynuProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _
self = > ???
tests/providers/integration_tests.py:259:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/dynu.py:36: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/dynu.py:166: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = conn = method = 'GET', url = '/v2/dns', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'API-Key': 'placeholder_auth_token', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True
def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool.
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _______________ EasyDnsProviderTests.test_provider_authenticate ________________ self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/easydns.py:38: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/easydns.py:154: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/domain/easydnstemp.com?format=json&_user=placeholder_auth_username&_key=placeholder_auth_token' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ EasyDnsProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? 
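Every failure in this run is the same incompatibility: urllib3 2.3 appears to have started logging response.version_string in connectionpool._make_request, and the VCRHTTPResponse stub that python-vcrpy substitutes while replaying recorded cassettes does not define that attribute, so the log.debug() call shown above raises before any provider logic is exercised. A python-vcrpy release that knows about the attribute should make these tests pass again; purely as an illustrative local workaround (not part of dns-lexicon, vcrpy, or this build), a conftest-level shim along these lines would also silence the error:

from vcr.stubs import VCRHTTPResponse

# Hypothetical shim: give vcrpy's replay stub the attribute that urllib3 >= 2.3
# reads when it logs the completed request. urllib3 only formats the value into
# a log message, so a plain class attribute is sufficient here.
if not hasattr(VCRHTTPResponse, "version_string"):
    VCRHTTPResponse.version_string = "HTTP/1.1"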
_ EasyDnsProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

tests/providers/integration_tests.py:126:
[traceback and urllib3 _make_request listing identical to test_provider_authenticate above, trimmed]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EasyDnsProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

tests/providers/integration_tests.py:133:
[traceback and urllib3 _make_request listing identical to test_provider_authenticate above, trimmed]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EasyDnsProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

tests/providers/integration_tests.py:156:
[traceback and urllib3 _make_request listing identical to test_provider_authenticate above, trimmed]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
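For reference, the pooling layer only touches a handful of response attributes at this point: it sets retries, _connection and _pool, then reads version_string, status and length_remaining for the debug line. A minimal stand-in object (purely illustrative; FakeResponse and the placeholder values below are not taken from urllib3, vcrpy, or the cassettes) shows that version_string is the only piece the cassette stub lacks:

class FakeResponse:
    # The three attributes read by the log.debug() call shown above.
    version_string = "HTTP/1.1"   # the one attribute missing on VCRHTTPResponse
    status = 200
    length_remaining = 0

resp = FakeResponse()
# The three attributes the pool sets just before logging.
resp.retries = None
resp._connection = None
resp._pool = None

# Same format string as connectionpool._make_request, with placeholder values.
print('%s://%s:%s "%s %s %s" %s %s' % (
    "https", "example.com", 443, "GET", "/v2/dns",
    resp.version_string, resp.status, resp.length_remaining,
))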
_ EasyDnsProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

tests/providers/integration_tests.py:147:
[traceback and urllib3 _make_request listing identical to test_provider_authenticate above, trimmed]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EasyDnsProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _

tests/providers/integration_tests.py:140:
[traceback and urllib3 _make_request listing identical to test_provider_authenticate above, trimmed]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EasyDnsProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _

tests/providers/integration_tests.py:303:
[traceback and urllib3 _make_request listing identical to test_provider_authenticate above, trimmed]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EasyDnsProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _

self = >

    ???
tests/providers/integration_tests.py:327:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/easydns.py:38: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/easydns.py:154: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
conn =
method = 'GET'
url = '/domain/easydnstemp.com?format=json&_user=placeholder_auth_username&_key=placeholder_auth_token'
body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =
preload_content = False, decode_content = False, enforce_content_length = True

    def _make_request(
        self,
        conn: BaseHTTPConnection,
        method: str,
        url: str,
        body: _TYPE_BODY | None = None,
        headers: typing.Mapping[str, str] | None = None,
        retries: Retry | None = None,
        timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
        chunked: bool = False,
        response_conn: BaseHTTPConnection | None = None,
        preload_content: bool = True,
        decode_content: bool = True,
        enforce_content_length: bool = True,
    ) -> BaseHTTPResponse:
        """
        Perform a request on a given urllib connection object taken from our pool.

        :param conn:
            a connection from one of our connection pools
        :param method:
            HTTP request method (such as GET, POST, PUT, etc.)
        :param url:
            The URL to perform the request on.
        :param body:
            Data to send in the request body, either :class:`str`, :class:`bytes`,
            an iterable of :class:`str`/:class:`bytes`, or a file-like object.
        :param headers:
            Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc.
            If None, pool headers are used.
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ EasyDnsProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/easydns.py:38: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/easydns.py:154: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/domain/easydnstemp.com?format=json&_user=placeholder_auth_username&_key=placeholder_auth_token' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ EasyDnsProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? 
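Every EasyDNS failure above is the same incompatibility rather than a provider bug: urllib3's _make_request dereferences response.version_string in its debug-logging call, but the cassette-replay object that vcrpy substitutes for the real response (the VCRHTTPResponse named in the AttributeError) does not define that attribute. A minimal local shim along the following lines could paper over it during the test run; this is a hypothetical sketch (the vcr.stubs import path and the fixed "HTTP/1.1" value are assumptions), and the proper fix is a vcrpy build that provides version_string itself.

    # conftest.py -- hypothetical compatibility shim, not part of dns-lexicon.
    # urllib3 only interpolates version_string into the debug log line shown
    # in the tracebacks above, so a fixed value is enough for cassette playback.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        VCRHTTPResponse.version_string = "HTTP/1.1"  # assumed constant for replayed responses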
_ EasyDnsProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _
tests/providers/integration_tests.py:294 -> [same call chain and _make_request listing as above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EasyDnsProviderTests.test_provider_when_calling_list_records_after_setting_ttl _
tests/providers/integration_tests.py:211 -> [same call chain and _make_request listing as above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
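Before patching anything, it is worth confirming in the build chroot that the installed python-vcrpy really predates version_string support while the installed urllib3 already expects it; a quick probe (hypothetical, using the same assumed vcr.stubs path as above) is enough:

    # Probe the chroot: which urllib3 is installed, and does the vcrpy stub
    # already define the attribute that urllib3's logging dereferences?
    import urllib3
    from vcr.stubs import VCRHTTPResponse

    print(urllib3.__version__, hasattr(VCRHTTPResponse, "version_string"))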
_ EasyDnsProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _
tests/providers/integration_tests.py:199 -> [same call chain and _make_request listing as above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EasyDnsProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _
tests/providers/integration_tests.py:185 -> [same call chain and _make_request listing as above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EasyDnsProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _
tests/providers/integration_tests.py:173 -> [same call chain and _make_request listing as above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EasyDnsProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _
tests/providers/integration_tests.py:166 -> [same call chain and _make_request listing as above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EasyDnsProviderTests.test_provider_when_calling_update_record_should_modify_record _
self = > ???
tests/providers/integration_tests.py:240:
[call chain and locals identical to the failures above; the same urllib3 _make_request listing repeats, down to its ECONNRESET comment:]
        except OSError as e:  # MacOS/Linux
            # EPROTOTYPE and ECONNRESET are needed on macOS
            # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/
            # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE.
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ EasyDnsProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/easydns.py:38: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/easydns.py:154: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/domain/easydnstemp.com?format=json&_user=placeholder_auth_username&_key=placeholder_auth_token' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ EasyDnsProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? 
tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/easydns.py:38: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/easydns.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/domain/easydnstemp.com?format=json&_user=placeholder_auth_username&_key=placeholder_auth_token' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
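The comment quoted just above ("Set properties that are used by the pooling layer") introduces the statements that fail throughout this log: after attaching the pooling attributes, urllib3 emits a debug line that reads `response.version_string`, and the `VCRHTTPResponse` objects the test suite replays do not define that attribute, hence the repeated AttributeError at connectionpool.py:551. The sketch below is a hypothetical local workaround only; the `vcr.stubs` import path and the `"HTTP/1.1"` default are assumptions about the installed vcrpy, not something taken from this log, and upgrading vcrpy (or pinning urllib3) would be the proper fix.

```python
# Hypothetical workaround sketch, e.g. in the test suite's conftest.py: give
# vcrpy's replayed response objects the attribute urllib3's logging expects.
# Assumed, not taken from this build log.
try:
    from vcr.stubs import VCRHTTPResponse
except ImportError:
    VCRHTTPResponse = None

if VCRHTTPResponse is not None and not hasattr(VCRHTTPResponse, "version_string"):
    # Replayed cassettes behave like HTTP/1.1 exchanges, so a constant is
    # enough for urllib3's '%s://%s:%s "%s %s %s" %s %s' debug line.
    VCRHTTPResponse.version_string = "HTTP/1.1"
```

With such a shim the debug call formats normally; without it every authenticated-provider test in this section aborts before reaching its assertions, which is exactly the pattern the log shows.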
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _______________ EasynameProviderTests.test_provider_authenticate _______________ self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/easyname.py:93: in authenticate ??? src/lexicon/_private/providers/easyname.py:57: in _get_csrf_token ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/en/login', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. 
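For readers following the quoted docstring, the `retries` and `timeout` values shown in the traceback locals are ordinary urllib3 objects. The sketch below only illustrates how a caller would construct them and pass them through urllib3's public API; the target URL is a placeholder and is not one used by these tests.

```python
# Illustration of the retries/timeout parameters documented above, using the
# same values that appear in the traceback locals. Placeholder URL.
import urllib3
from urllib3.util import Retry, Timeout

http = urllib3.PoolManager()
resp = http.request(
    "GET",
    "https://example.com/",
    retries=Retry(total=0, connect=None, read=False, redirect=None, status=None),
    timeout=Timeout(connect=None, read=None, total=None),
)
# On a real urllib3 response (recent enough to log it, per the traceback
# above) version_string is present alongside status.
print(resp.status, resp.version_string)
```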
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ EasynameProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/easyname.py:93: in authenticate ??? src/lexicon/_private/providers/easyname.py:57: in _get_csrf_token ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/en/login', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ EasynameProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:133: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/easyname.py:93: in authenticate ??? src/lexicon/_private/providers/easyname.py:57: in _get_csrf_token ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/en/login', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
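The EasyDNS and Easyname failures in this section all share the one AttributeError, so a quick check of which library versions the build chroot actually installed narrows the diagnosis faster than reading each traceback. The snippet below only prints versions; which exact urllib3/vcrpy pairing restores compatibility is not stated by this log and is left to the reader to verify.

```python
# Print the installed versions of the libraries involved in the repeated
# AttributeError; the distribution names are the PyPI names.
from importlib.metadata import version

for dist in ("urllib3", "vcrpy", "requests"):
    print(dist, version(dist))
```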
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ EasynameProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = > ??? tests/providers/integration_tests.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/easyname.py:93: in authenticate ??? src/lexicon/_private/providers/easyname.py:57: in _get_csrf_token ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/en/login', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
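If the goal is simply to recover signal from the rest of the test suite while the incompatibility persists, deselecting the affected provider classes is one triage option. The keyword expression below is an assumption about how one might scope it; this build does not do so, and the approach hides the problem rather than fixing it.

```python
# Hypothetical triage only: run pytest while skipping the provider test
# classes that fail on the version_string AttributeError.
import pytest

raise SystemExit(pytest.main(["-k", "not EasyDns and not Easyname"]))
```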
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ EasynameProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = > ??? tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/easyname.py:93: in authenticate ??? src/lexicon/_private/providers/easyname.py:57: in _get_csrf_token ??? 
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/en/login', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. 
Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ EasynameProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/easyname.py:93: in authenticate ??? 
_ EasynameProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _

self =
>   ???
tests/providers/integration_tests.py:140:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/easyname.py:93: in authenticate
    ???
src/lexicon/_private/providers/easyname.py:57: in _get_csrf_token
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
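For context on the Retry(total=0, connect=None, read=False, redirect=None, status=None) value shown in the _make_request locals earlier: that is the no-retry policy requests' default HTTPAdapter hands to urllib3 for a plain session.get(). A minimal, self-contained sketch of passing the same policy to urllib3 directly (the URL is a placeholder and has nothing to do with the test suite):

# Standalone illustration of the retries parameter described in urllib3's
# _make_request docstring; example.com is a placeholder host.
import urllib3
from urllib3.util.retry import Retry

http = urllib3.PoolManager()

# total=0 means the initial attempt is made but never retried; read=False
# makes read errors surface immediately instead of being retried.
resp = http.request(
    "GET",
    "https://example.com/en/login",
    retries=Retry(total=0, connect=None, read=False, redirect=None, status=None),
)
print(resp.status)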
_ EasynameProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _

self =
>   ???
tests/providers/integration_tests.py:489:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/easyname.py:93: in authenticate
    ???
src/lexicon/_private/providers/easyname.py:57: in _get_csrf_token
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EasynameProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _

self =
>   ???
tests/providers/integration_tests.py:475:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/easyname.py:93: in authenticate
    ???
src/lexicon/_private/providers/easyname.py:57: in _get_csrf_token
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EasynameProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _

self =
>   ???
tests/providers/integration_tests.py:303:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/easyname.py:93: in authenticate
    ???
src/lexicon/_private/providers/easyname.py:57: in _get_csrf_token
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EasynameProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _

self =
>   ???
tests/providers/integration_tests.py:327:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/easyname.py:93: in authenticate
    ???
src/lexicon/_private/providers/easyname.py:57: in _get_csrf_token
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EasynameProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _

self =
>   ???
tests/providers/integration_tests.py:313:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/easyname.py:93: in authenticate
    ???
src/lexicon/_private/providers/easyname.py:57: in _get_csrf_token
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EasynameProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _

self =
>   ???
tests/providers/integration_tests.py:294:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/easyname.py:93: in authenticate
    ???
src/lexicon/_private/providers/easyname.py:57: in _get_csrf_token
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EasynameProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _

self =
>   ???
tests/providers/integration_tests.py:541:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/easyname.py:93: in authenticate
    ???
src/lexicon/_private/providers/easyname.py:57: in _get_csrf_token
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EasynameProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _

self =
>   ???
tests/providers/integration_tests.py:521:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/easyname.py:93: in authenticate
    ???
src/lexicon/_private/providers/easyname.py:57: in _get_csrf_token
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
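All of the EasynameProviderTests failures above share the cassette-playback root cause described earlier rather than a defect in the provider code itself. Purely as a packaging-side sketch (a hypothetical conftest.py addition, not something dns-lexicon ships), a collection hook like the following could skip the affected tests until the upstream incompatibility is resolved:

# Hypothetical conftest.py snippet: skip Easyname provider tests whose VCR
# cassettes crash under urllib3's version_string logging.
import pytest

def pytest_collection_modifyitems(config, items):
    skip_marker = pytest.mark.skip(
        reason="VCR cassette playback incompatible with urllib3 version_string logging"
    )
    for item in items:
        # Node IDs contain the test class name, e.g.
        # "...::EasynameProviderTests::test_provider_when_calling_...".
        if "EasynameProviderTests" in item.nodeid:
            item.add_marker(skip_marker)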
_ EasynameProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _

tests/providers/integration_tests.py:521:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/easyname.py:93: in authenticate
src/lexicon/_private/providers/easyname.py:57: in _get_csrf_token
[... requests/urllib3 frames, locals and _make_request() source elided; identical to the failure above ...]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ EasynameProviderTests.test_provider_when_calling_list_records_after_setting_ttl _

tests/providers/integration_tests.py:211:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/easyname.py:93: in authenticate
src/lexicon/_private/providers/easyname.py:57: in _get_csrf_token
[... requests/urllib3 frames, locals and _make_request() source elided; identical to the failure above ...]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ EasynameProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _

tests/providers/integration_tests.py:507:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/easyname.py:93: in authenticate
src/lexicon/_private/providers/easyname.py:57: in _get_csrf_token
[... requests/urllib3 frames, locals and _make_request() source elided; identical to the failure above ...]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ EasynameProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _

tests/providers/integration_tests.py:199:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/easyname.py:93: in authenticate
src/lexicon/_private/providers/easyname.py:57: in _get_csrf_token
[... requests/urllib3 frames, locals and _make_request() source elided; identical to the failure above ...]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ EasynameProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _

tests/providers/integration_tests.py:185:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/easyname.py:93: in authenticate
src/lexicon/_private/providers/easyname.py:57: in _get_csrf_token
[... requests/urllib3 frames, locals and _make_request() source elided; identical to the failure above ...]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ EasynameProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _

tests/providers/integration_tests.py:501:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/easyname.py:93: in authenticate
src/lexicon/_private/providers/easyname.py:57: in _get_csrf_token
[... requests/urllib3 frames, locals and _make_request() source elided; identical to the failure above ...]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ EasynameProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _

tests/providers/integration_tests.py:173:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/easyname.py:93: in authenticate
src/lexicon/_private/providers/easyname.py:57: in _get_csrf_token
[... requests/urllib3 frames, locals and _make_request() source elided; identical to the failure above ...]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ EasynameProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _

tests/providers/integration_tests.py:166:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/easyname.py:93: in authenticate
src/lexicon/_private/providers/easyname.py:57: in _get_csrf_token
[... requests/urllib3 frames, locals and _make_request() source elided; identical to the failure above ...]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
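The update_record failures that follow repeat the same signature. Until the chroot carries a vcrpy release that knows about urllib3's version_string, a conftest-level shim along these lines could paper over the missing attribute. This is only a sketch: it assumes VCRHTTPResponse is importable from vcr.stubs and stores the recorded protocol version in a version attribute, neither of which is shown in this log.

# conftest.py sketch (hypothetical workaround, not the upstream fix): give the
# stubbed response the version_string property that urllib3 >= 2.3 logs.
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):

    def _version_string(self) -> str:
        # Map http.client-style version codes (10/11) to the string urllib3 expects.
        return {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(getattr(self, "version", 11), "HTTP/1.1")

    VCRHTTPResponse.version_string = property(_version_string)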
src/lexicon/_private/providers/easyname.py:57: in _get_csrf_token ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/en/login', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ EasynameProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/easyname.py:93: in authenticate ??? src/lexicon/_private/providers/easyname.py:57: in _get_csrf_token ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/en/login', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ EasynameProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? 
tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/easyname.py:93: in authenticate ??? src/lexicon/_private/providers/easyname.py:57: in _get_csrf_token ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/en/login', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ EasynameProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/easyname.py:93: in authenticate ??? src/lexicon/_private/providers/easyname.py:57: in _get_csrf_token ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/en/login', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. 
Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError ________________ EUservProviderTests.test_provider_authenticate ________________ self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/euserv.py:65: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/euserv.py:236: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/?method=json', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json; charset=utf-8', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ EUservProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/euserv.py:65: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/euserv.py:236: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/?method=json', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json; charset=utf-8', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ EUservProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:133: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/euserv.py:65: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/euserv.py:236: in _request ??? 
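The EUserv failures that follow are the same mismatch, only reached through euserv.py's authenticate() and _request() instead of the Easyname CSRF fetch. A quick probe inside the chroot, a sketch using only the standard library, would show which distribution versions are actually involved (requests 2.32.3 is already visible in the recorded User-Agent; the urllib3 and vcrpy versions are whatever pacman installed):

# Print the distributions involved in the AttributeError above.
from importlib.metadata import version

for dist in ("urllib3", "vcrpy", "requests"):
    print(dist, version(dist))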
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/?method=json', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json; charset=utf-8', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ EUservProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = > ??? tests/providers/integration_tests.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/euserv.py:65: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/euserv.py:236: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  conn =  method = 'GET', url = '/?method=json', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json; charset=utf-8', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =  preload_content = False, decode_content = False, enforce_content_length = True

        # Set properties that are used by the pooling layer.
        response.retries = retries
        response._connection = response_conn  # type: ignore[attr-defined]
        response._pool = self  # type: ignore[attr-defined]

        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
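Every euserv failure above and below trips over the same incompatibility: the urllib3 in the chroot logs response.version_string from its connection pool (connectionpool.py:551), but vcrpy's cassette-replay class VCRHTTPResponse does not provide that attribute. A minimal conftest.py shim along the following lines could serve as a local stopgap until an updated vcrpy provides version_string itself; the "HTTP/1.1" value is an assumption for illustration (replayed cassettes do not carry the negotiated HTTP version), not something taken from this build.

# conftest.py -- sketch of a possible local workaround, not part of this package
from vcr.stubs import VCRHTTPResponse

# urllib3's pooled-connection debug logging reads response.version_string;
# give the VCR replay response a plausible default so that log line can format.
if not hasattr(VCRHTTPResponse, "version_string"):
    VCRHTTPResponse.version_string = "HTTP/1.1"  # assumed default for replayed responses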
Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ EUservProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/euserv.py:65: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/euserv.py:236: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/?method=json', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json; charset=utf-8', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ EUservProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _ self = > ??? tests/providers/integration_tests.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/euserv.py:65: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/euserv.py:236: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/?method=json', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json; charset=utf-8', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ EUservProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ self = > ??? tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/euserv.py:65: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/euserv.py:236: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/?method=json', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json; charset=utf-8', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ EUservProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/euserv.py:65: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/euserv.py:236: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/?method=json', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json; charset=utf-8', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ EUservProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/euserv.py:65: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/euserv.py:236: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/?method=json', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json; charset=utf-8', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. 
Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ EUservProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/euserv.py:65: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/euserv.py:236: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/?method=json', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json; charset=utf-8', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EUservProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _
tests/providers/integration_tests.py:294:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/euserv.py:65: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/euserv.py:236: in _request
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
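Every EUserv failure here is the same incompatibility: at connectionpool.py:551 urllib3's _make_request logs response.version_string, an attribute its own response objects carry, but the VCRHTTPResponse objects that vcrpy substitutes while replaying cassettes do not define it, so the attribute lookup raises AttributeError before the provider code ever sees a reply. A minimal test-side shim, assuming the replay class is importable as vcr.stubs.VCRHTTPResponse and that a hypothetical local conftest.py is an acceptable place for it, could backfill the attribute until a vcrpy release that knows about it is packaged:

# conftest.py -- hypothetical compatibility shim, not part of dns-lexicon.
# Assumption: the installed vcrpy predates urllib3's version_string attribute
# and exposes its replay response class as vcr.stubs.VCRHTTPResponse.
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):
    # Mirror urllib3's mapping of the numeric HTTP version (10 or 11) to a
    # protocol string so connectionpool's log.debug() call can read it.
    VCRHTTPResponse.version_string = property(
        lambda self: {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(
            getattr(self, "version", 11), "HTTP/?"
        )
    )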
_ EUservProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _
tests/providers/integration_tests.py:541:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/euserv.py:65: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/euserv.py:236: in _request
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EUservProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _
tests/providers/integration_tests.py:521:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/euserv.py:65: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/euserv.py:236: in _request
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EUservProviderTests.test_provider_when_calling_list_records_after_setting_ttl _
tests/providers/integration_tests.py:211:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/euserv.py:65: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/euserv.py:236: in _request
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EUservProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _
tests/providers/integration_tests.py:507:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/euserv.py:65: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/euserv.py:236: in _request
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EUservProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _
tests/providers/integration_tests.py:199:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/euserv.py:65: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/euserv.py:236: in _request
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EUservProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _
tests/providers/integration_tests.py:185:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/euserv.py:65: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/euserv.py:236: in _request
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EUservProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _
tests/providers/integration_tests.py:501:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/euserv.py:65: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/euserv.py:236: in _request
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ EUservProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/euserv.py:65: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/euserv.py:236: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/?method=json', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json; charset=utf-8', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
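Note (editorial): every failure in this block dies at the same point. While assembling the arguments for a log.debug() call, urllib3's connection pool reads response.version_string; because the arguments are evaluated before the logging call itself, the access happens regardless of the configured log level. version_string was added to urllib3's own response classes in the 2.3 series, but the tests replay traffic through vcrpy, and the VCRHTTPResponse stub in this chroot predates the attribute, so the AttributeError fires before any provider-specific code is exercised. A minimal sketch of a local compatibility shim follows; it is an assumption-level illustration (the conftest.py placement and the HTTP/1.1 fallback are guesses), not dns-lexicon's or vcrpy's actual fix. Upgrading vcrpy to a release that knows about the attribute is the cleaner route.

# conftest.py: hedged sketch of a local workaround, not the upstream fix.
# Assumption: vcrpy exposes its fake response class as vcr.stubs.VCRHTTPResponse
# (the class name matches the AttributeError in the log above).
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):

    @property
    def _version_string(self):
        # urllib3 >= 2.3 reads response.version_string for its debug log line;
        # derive a best-effort value from the numeric HTTP version, if present.
        return "HTTP/1.1" if getattr(self, "version", 11) == 11 else "HTTP/1.0"

    # Attach the property to the stub class so replayed responses satisfy urllib3.
    VCRHTTPResponse.version_string = _version_string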
_ EUservProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _

self = > ???
tests/providers/integration_tests.py:166:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/euserv.py:65: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/euserv.py:236: in _request
[... same requests/urllib3 frames, request parameters (GET /?method=json) and _make_request() listing as in the failure above ...]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EUservProviderTests.test_provider_when_calling_update_record_should_modify_record _

self = > ???
tests/providers/integration_tests.py:240:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/euserv.py:65: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/euserv.py:236: in _request
[... same requests/urllib3 frames, request parameters (GET /?method=json) and _make_request() listing as in the failure above ...]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EUservProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _

self = > ???
tests/providers/integration_tests.py:251:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/euserv.py:65: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/euserv.py:236: in _request
[... same requests/urllib3 frames, request parameters (GET /?method=json) and _make_request() listing as in the failure above ...]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EUservProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _

self = > ???
tests/providers/integration_tests.py:275:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/euserv.py:65: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/euserv.py:236: in _request
[... same requests/urllib3 frames, request parameters (GET /?method=json) and _make_request() listing as in the failure above ...]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ EUservProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _

self = > ???
tests/providers/integration_tests.py:259:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/euserv.py:65: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/euserv.py:236: in _request
[... same requests/urllib3 frames, request parameters (GET /?method=json) and _make_request() listing as in the failure above ...]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_______________ ExoscaleProviderTests.test_provider_authenticate _______________

self = > ???
tests/providers/integration_tests.py:106:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/exoscale.py:36: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/exoscale.py:173: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =   conn =
method = 'GET', url = '/dns/v1/domains/lexicontest.com', body = b'{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...-DNS-Token': 'placeholder_auth_key:placeholder_auth_secret', 'Content-Length': '2', 'Content-Type': 'application/json'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =   preload_content = False, decode_content = False, enforce_content_length = True
[... full source of urllib3's _make_request() elided; identical to the listings above ...]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
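Note (editorial): the Exoscale failures that follow have the same shape as the EUserv ones: the first request issued from authenticate() is served from a vcrpy cassette, and urllib3's _make_request() trips over the missing version_string attribute before the response ever reaches requests, so none of the provider assertions run. The snippet below is a hedged, stand-alone reproduction of that interaction outside the test suite; the cassette filename and URL are placeholders rather than anything from dns-lexicon, and it assumes urllib3 >= 2.3 alongside a vcrpy release that predates the attribute.

# repro.py: illustrative sketch only; assumes urllib3 >= 2.3 and an older vcrpy.
import requests
import vcr

# The first run records example-cassette.yaml from a live request; later runs
# replay it through vcrpy's VCRHTTPResponse stub, which is where the missing
# version_string attribute surfaces as an AttributeError inside urllib3.
with vcr.use_cassette("example-cassette.yaml", record_mode="once"):
    requests.get("https://httpbin.org/get", timeout=10)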
_ ExoscaleProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

self = > ???
tests/providers/integration_tests.py:126:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/exoscale.py:36: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/exoscale.py:173: in _request
[... same requests/urllib3 frames, request parameters (GET /dns/v1/domains/lexicontest.com) and _make_request() listing as in the failure above ...]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ExoscaleProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

self = > ???
tests/providers/integration_tests.py:133:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/exoscale.py:36: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/exoscale.py:173: in _request
[... same requests/urllib3 frames, request parameters (GET /dns/v1/domains/lexicontest.com) and _make_request() listing as in the failure above ...]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ExoscaleProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = > ??? tests/providers/integration_tests.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/exoscale.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/exoscale.py:173: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/v1/domains/lexicontest.com', body = b'{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...-DNS-Token': 'placeholder_auth_key:placeholder_auth_secret', 'Content-Length': '2', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
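Every one of these Exoscale failures hits the same frame: the urllib3 in this chroot (apparently 2.3 or newer, since its connection pool logs response.version_string after each request) reads an attribute that vcrpy's VCRHTTPResponse stub, which backs Lexicon's cassette-based integration tests, does not define. A minimal, hypothetical conftest.py shim for the test suite (not part of dns-lexicon, and assuming the stub really is vcr.stubs.VCRHTTPResponse as the AttributeError indicates) could paper over the missing attribute:

    # conftest.py -- hypothetical compatibility shim, assuming vcrpy's
    # VCRHTTPResponse stub is what urllib3 ends up logging about.
    import vcr.stubs

    def pytest_configure(config):
        # urllib3 >= 2.3 reads response.version_string when logging a finished
        # request; give the VCR stub a benign default so the lookup succeeds.
        if not hasattr(vcr.stubs.VCRHTTPResponse, "version_string"):
            vcr.stubs.VCRHTTPResponse.version_string = "HTTP/1.1"

Updating python-vcrpy in the build root to a release that knows about urllib3 2.3, or constraining urllib3 below 2.3 for check(), would likely be a cleaner fix than patching the stub locally.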
_ ExoscaleProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

tests/providers/integration_tests.py:147: 
    [traceback and urllib3 source identical to the first failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ ExoscaleProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _

tests/providers/integration_tests.py:140: 
    [traceback and urllib3 source identical to the first failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ ExoscaleProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _

tests/providers/integration_tests.py:489: 
    [traceback and urllib3 source identical to the first failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ ExoscaleProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _

tests/providers/integration_tests.py:475: 
    [traceback and urllib3 source identical to the first failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ ExoscaleProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _

tests/providers/integration_tests.py:303: 
    [traceback and urllib3 source identical to the first failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ ExoscaleProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _

tests/providers/integration_tests.py:327: 
    [traceback and urllib3 source identical to the first failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ ExoscaleProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _

tests/providers/integration_tests.py:313: 
    [traceback and urllib3 source identical to the first failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
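A quick way to confirm the pairing inside the chroot before touching the PKGBUILD is a throwaway check (hypothetical, stdlib importlib.metadata only):

    # diagnostic sketch: confirm the urllib3 / vcrpy combination and whether
    # the VCR stub exposes the attribute urllib3's logging expects
    from importlib.metadata import version
    import vcr.stubs

    print("urllib3:", version("urllib3"))
    print("vcrpy:", version("vcrpy"))
    print("stub has version_string:", hasattr(vcr.stubs.VCRHTTPResponse, "version_string"))

If the last line prints False while urllib3 reports 2.3 or newer, every cassette-backed provider test will keep failing exactly as above.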
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ExoscaleProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/exoscale.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/exoscale.py:173: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/v1/domains/lexicontest.com', body = b'{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...-DNS-Token': 'placeholder_auth_key:placeholder_auth_secret', 'Content-Length': '2', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ExoscaleProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/exoscale.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/exoscale.py:173: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/v1/domains/lexicontest.com', body = b'{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...-DNS-Token': 'placeholder_auth_key:placeholder_auth_secret', 'Content-Length': '2', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ExoscaleProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/exoscale.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/exoscale.py:173: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/v1/domains/lexicontest.com', body = b'{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...-DNS-Token': 'placeholder_auth_key:placeholder_auth_secret', 'Content-Length': '2', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ExoscaleProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ self = > ??? 
tests/providers/integration_tests.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/exoscale.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/exoscale.py:173: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/v1/domains/lexicontest.com', body = b'{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...-DNS-Token': 'placeholder_auth_key:placeholder_auth_secret', 'Content-Length': '2', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ExoscaleProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? tests/providers/integration_tests.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/exoscale.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/exoscale.py:173: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/v1/domains/lexicontest.com', body = b'{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...-DNS-Token': 'placeholder_auth_key:placeholder_auth_secret', 'Content-Length': '2', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ExoscaleProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/exoscale.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/exoscale.py:173: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/v1/domains/lexicontest.com', body = b'{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...-DNS-Token': 'placeholder_auth_key:placeholder_auth_secret', 'Content-Length': '2', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ExoscaleProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/exoscale.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/exoscale.py:173: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/v1/domains/lexicontest.com', body = b'{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...-DNS-Token': 'placeholder_auth_key:placeholder_auth_secret', 'Content-Length': '2', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ExoscaleProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ self = > ??? 
tests/providers/integration_tests.py:501:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/exoscale.py:36: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/exoscale.py:173: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
method = 'GET', url = '/dns/v1/domains/lexicontest.com', body = b'{}'
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ExoscaleProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _
tests/providers/integration_tests.py:173:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/exoscale.py:36: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/exoscale.py:173: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ExoscaleProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _
tests/providers/integration_tests.py:166:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/exoscale.py:36: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/exoscale.py:173: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ExoscaleProviderTests.test_provider_when_calling_update_record_should_modify_record _
tests/providers/integration_tests.py:240:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/exoscale.py:36: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/exoscale.py:173: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ExoscaleProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _
tests/providers/integration_tests.py:251:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/exoscale.py:36: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/exoscale.py:173: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ExoscaleProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _
tests/providers/integration_tests.py:275:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/exoscale.py:36: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/exoscale.py:173: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ExoscaleProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _
tests/providers/integration_tests.py:259:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/exoscale.py:36: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/exoscale.py:173: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
____________ FlexibleEngineProviderTests.test_provider_authenticate ____________
tests/providers/integration_tests.py:106:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/flexibleengine.py:38: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/flexibleengine.py:215: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
method = 'GET', url = '/v2/zones?name=flexibleengine.test', body = '{}'
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
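Every failure above is the same incompatibility: the integration tests replay recorded HTTP exchanges through vcrpy, and urllib3's connection pool now reads response.version_string for its debug log line (a recent urllib3 addition, visible in the traceback), an attribute the stubbed VCRHTTPResponse does not provide. A minimal workaround sketch, assuming the installed vcrpy still lacks the attribute and exposes VCRHTTPResponse from vcr.stubs, would be to backfill it from a conftest.py before the test session; upgrading to a vcrpy release that defines version_string itself would make the patch unnecessary.

    # conftest.py -- hypothetical compatibility shim, not part of the dns-lexicon sources
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        @property
        def _version_string(self):
            # http.client stores HTTP/1.0 and HTTP/1.1 as 10 and 11 in .version
            return {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(getattr(self, "version", None), "HTTP/?")

        # expose the read-only attribute urllib3's logging call expects
        VCRHTTPResponse.version_string = _version_string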
_ FlexibleEngineProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
self = > ???
tests/providers/integration_tests.py:126: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/flexibleengine.py:38: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/flexibleengine.py:215: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = conn = method = 'GET', url = '/v2/zones?name=flexibleengine.test', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate',
           'Accept': 'application/json', 'Connection': 'keep-alive',
           'Content-Type': 'application/json', 'X-Auth-Token': 'placeholder_auth_token',
           'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn = preload_content = False, decode_content = False, enforce_content_length = True

    def _make_request(
        self,
        conn: BaseHTTPConnection,
        method: str,
        url: str,
        body: _TYPE_BODY | None = None,
        headers: typing.Mapping[str, str] | None = None,
        retries: Retry | None = None,
        timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
        chunked: bool = False,
        response_conn: BaseHTTPConnection | None = None,
        preload_content: bool = True,
        decode_content: bool = True,
        enforce_content_length: bool = True,
    ) -> BaseHTTPResponse:
        [docstring and body of urllib3's _make_request elided; this frame is
         identical in every failure below, and the statement that raises is the
         debug log emitted after the response has been received]
        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
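Every failure in this run is the same crash: urllib3 2.3 started logging response.version_string after each request (the connectionpool.py:551 frame above), and the VCRHTTPResponse object that vcrpy substitutes during cassette playback does not provide that attribute. Upgrading python-vcrpy to a release that adds it (the 7.x series reportedly does) is the clean fix; as a stopgap, a local shim along these lines could be dropped into the test suite's conftest.py. This is a sketch only, not part of the dns-lexicon sources.

    # conftest.py -- hypothetical compatibility shim; assumes the installed
    # vcrpy still predates the urllib3 2.3 'version_string' attribute.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # urllib3 only uses this value in the debug log line that crashes above,
        # so a static placeholder is enough for cassette playback.
        VCRHTTPResponse.version_string = "HTTP/1.1"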
_ FlexibleEngineProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

tests/providers/integration_tests.py:133:
    [same authenticate() call chain and urllib3 _make_request frame as above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ FlexibleEngineProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

tests/providers/integration_tests.py:156:
    [same authenticate() call chain and urllib3 _make_request frame as above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ FlexibleEngineProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

tests/providers/integration_tests.py:147:
    [same authenticate() call chain and urllib3 _make_request frame as above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ FlexibleEngineProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _

tests/providers/integration_tests.py:140:
    [same authenticate() call chain and urllib3 _make_request frame as above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ FlexibleEngineProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _

tests/providers/integration_tests.py:489:
    [same authenticate() call chain and urllib3 _make_request frame as above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ FlexibleEngineProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _

tests/providers/integration_tests.py:475:
    [same authenticate() call chain and urllib3 _make_request frame as above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
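To confirm which combination of libraries the chroot actually resolved, a quick check from inside the build root could print the installed versions; this is a hypothetical diagnostic, not captured in the log itself.

    # Print the versions pytest picked up; the failure pattern above points to
    # urllib3 >= 2.3 paired with an older vcrpy.
    from importlib.metadata import version

    for dist in ("urllib3", "vcrpy", "requests"):
        print(dist, version(dist))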
_ FlexibleEngineProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _

tests/providers/integration_tests.py:303:
    [same authenticate() call chain and urllib3 _make_request frame as above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
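If the goal is only to get the riscv64 rebuild through while the vcrpy/urllib3 mismatch is sorted out upstream, the affected cassette-based suite could be skipped at collection time instead of failing check(). Again a sketch under that assumption, not something present in the package.

    # conftest.py -- hypothetical skip hook for the rebuild only.
    import pytest

    def pytest_collection_modifyitems(config, items):
        skip_vcr = pytest.mark.skip(
            reason="VCRHTTPResponse lacks version_string required by urllib3 >= 2.3"
        )
        for item in items:
            # FlexibleEngineProviderTests is the suite failing in this log; other
            # cassette-backed provider suites may need the same treatment.
            if "FlexibleEngineProviderTests" in item.nodeid:
                item.add_marker(skip_vcr)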
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/zones?name=flexibleengine.test', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'X-Auth-Token': 'placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ FlexibleEngineProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? 
tests/providers/integration_tests.py:327:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/flexibleengine.py:38: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/flexibleengine.py:215: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  conn =  method = 'GET', url = '/v2/zones?name=flexibleengine.test', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'X-Auth-Token': 'placeholder_auth_token', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =  preload_content = False, decode_content = False, enforce_content_length = True

        # Set properties that are used by the pooling layer.
        response.retries = retries
        response._connection = response_conn  # type: ignore[attr-defined]
        response._pool = self  # type: ignore[attr-defined]

        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
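All of the FlexibleEngine provider failures in this run are the same crash: urllib3 2.3 interpolates response.version_string into its debug log line (the marked statement above, connectionpool.py:551), while the playback stub that vcrpy substitutes for a real response, VCRHTTPResponse, does not define that attribute, so each test aborts inside authenticate() before any provider logic runs. A minimal local workaround is sketched below; it assumes the stub is importable as vcr.stubs.VCRHTTPResponse (where recent vcrpy releases define it) and that the attribute is only ever read for logging, so any fixed string will do. This is an illustrative conftest.py snippet, not part of the dns-lexicon sources:

    # conftest.py (sketch): paper over the vcrpy/urllib3 mismatch seen above.
    # Assumption: the stub class lives at vcr.stubs.VCRHTTPResponse and the
    # attribute is only consumed by urllib3's log.debug() call, so a constant is fine.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # A class attribute satisfies the plain attribute lookup urllib3 performs.
        VCRHTTPResponse.version_string = "HTTP/1.1"

If patching the stub inside the clean chroot is undesirable, the same failures could instead be avoided by constraining python-urllib3 below 2.3 in the build root, or by deselecting the affected tests for the duration of the build, for example python -m pytest -k "not FlexibleEngineProviderTests" (hypothetical invocation; the exact check() command used by the PKGBUILD is not shown in this log).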
_ FlexibleEngineProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _
tests/providers/integration_tests.py:313:
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ FlexibleEngineProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _
tests/providers/integration_tests.py:294:
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ FlexibleEngineProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _
tests/providers/integration_tests.py:521:
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ FlexibleEngineProviderTests.test_provider_when_calling_list_records_after_setting_ttl _
tests/providers/integration_tests.py:211:
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ FlexibleEngineProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _
tests/providers/integration_tests.py:501:
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ FlexibleEngineProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _
tests/providers/integration_tests.py:166:
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ FlexibleEngineProviderTests.test_provider_when_calling_update_record_should_modify_record _
tests/providers/integration_tests.py:240:
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ FlexibleEngineProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _
tests/providers/integration_tests.py:251:
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ FlexibleEngineProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/flexibleengine.py:38: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/flexibleengine.py:215: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/zones?name=flexibleengine.test', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'X-Auth-Token': 'placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
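Note on the failures: every test in this run aborts with the same AttributeError. urllib3's connection pool logs response.version_string after receiving a response (connectionpool.py:551 here), but the VCRHTTPResponse object that vcrpy substitutes during cassette playback does not define that attribute, so the provider tests fail before authentication completes. Below is a minimal compatibility-shim sketch, assuming vcrpy's vcr.stubs.VCRHTTPResponse class and a test-suite conftest.py; it is illustrative only and not part of this package build. Upgrading vcrpy to a release that defines version_string, or constraining urllib3 below the version that logs it, would address the same mismatch.

    # conftest.py (hypothetical) -- sketch of a shim for the mismatch seen above:
    # give vcrpy's replayed responses the version_string attribute that newer
    # urllib3 reads when logging a completed request.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # Cassette playback does not record the HTTP version, so a fixed value
        # is assumed here purely to satisfy urllib3's attribute lookup.
        VCRHTTPResponse.version_string = "HTTP/1.1"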
_ FlexibleEngineProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _

tests/providers/integration_tests.py:275: 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/flexibleengine.py:38: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/flexibleengine.py:215: in _request
[same requests/urllib3 frames and _make_request listing as in the first failure above; GET /v2/zones?name=flexibleengine.test]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ FlexibleEngineProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _

tests/providers/integration_tests.py:259: 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/flexibleengine.py:38: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/flexibleengine.py:215: in _request
[same requests/urllib3 frames and _make_request listing as above; GET /v2/zones?name=flexibleengine.test]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
______________ GandiRESTProviderTests.test_provider_authenticate _______________

tests/providers/integration_tests.py:106: 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/gandi.py:95: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/gandi.py:288: in _request
[same requests/urllib3 frames and _make_request listing as above; GET /v5/livedns/domains/t18s.fr with 'Authorization': 'Apikey placeholder_auth_token']
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GandiRESTProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

tests/providers/integration_tests.py:126: 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/gandi.py:95: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/gandi.py:288: in _request
[same requests/urllib3 frames and _make_request listing as above; GET /v5/livedns/domains/t18s.fr]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GandiRESTProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

tests/providers/integration_tests.py:133: 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/gandi.py:95: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/gandi.py:288: in _request
[same requests/urllib3 frames and _make_request listing as above; GET /v5/livedns/domains/t18s.fr]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GandiRESTProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

tests/providers/integration_tests.py:156: 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/gandi.py:95: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/gandi.py:288: in _request
[same requests/urllib3 frames and _make_request listing as above; GET /v5/livedns/domains/t18s.fr]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GandiRESTProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

tests/providers/integration_tests.py:147: 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/gandi.py:95: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/gandi.py:288: in _request
[same requests/urllib3 frames and _make_request listing as above; GET /v5/livedns/domains/t18s.fr]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GandiRESTProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _

tests/providers/integration_tests.py:140: 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/gandi.py:95: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/gandi.py:288: in _request
[same requests/urllib3 frames and _make_request listing as above; GET /v5/livedns/domains/t18s.fr]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GandiRESTProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _

self = 
    ???
tests/providers/integration_tests.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gandi.py:95: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/gandi.py:288: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v5/livedns/domains/t18s.fr', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'Apikey placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GandiRESTProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ self = > ??? tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gandi.py:95: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/gandi.py:288: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v5/livedns/domains/t18s.fr', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'Apikey placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GandiRESTProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gandi.py:95: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/gandi.py:288: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v5/livedns/domains/t18s.fr', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'Apikey placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GandiRESTProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gandi.py:95: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/gandi.py:288: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v5/livedns/domains/t18s.fr', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'Apikey placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GandiRESTProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? 
tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gandi.py:95: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/gandi.py:288: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v5/livedns/domains/t18s.fr', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'Apikey placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GandiRESTProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gandi.py:95: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/gandi.py:288: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v5/livedns/domains/t18s.fr', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'Apikey placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GandiRESTProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gandi.py:95: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/gandi.py:288: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v5/livedns/domains/t18s.fr', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'Apikey placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GandiRESTProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gandi.py:95: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/gandi.py:288: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v5/livedns/domains/t18s.fr', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'Apikey placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GandiRESTProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ self = > ??? 
tests/providers/integration_tests.py:211:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/gandi.py:95: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/gandi.py:288: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GandiRESTProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ???
tests/providers/integration_tests.py:507:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/gandi.py:95: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/gandi.py:288: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GandiRESTProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ???
tests/providers/integration_tests.py:199:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/gandi.py:95: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/gandi.py:288: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GandiRESTProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ???
tests/providers/integration_tests.py:185:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/gandi.py:95: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/gandi.py:288: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
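All of these GandiRESTProviderTests failures share one root cause: the urllib3 in this chroot (2.3 or newer) logs response.version_string inside connectionpool._make_request (line 551), but the VCRHTTPResponse object that vcrpy substitutes during cassette playback does not provide that attribute, so each test aborts with the AttributeError above before it can exercise the provider. The snippet below is only a hypothetical local workaround sketch, not part of dns-lexicon or this PKGBUILD; it assumes vcrpy exposes VCRHTTPResponse in vcr.stubs and that a plain class attribute is enough to satisfy urllib3's debug logging. The durable fix is a vcrpy release that supports urllib3 2.3.

    # conftest.py -- hypothetical workaround sketch, not shipped with dns-lexicon
    from vcr.stubs import VCRHTTPResponse

    # urllib3 2.3 reads response.version_string when logging the request in
    # connectionpool._make_request; give the playback stub a harmless default
    # so the attribute lookup stops raising AttributeError during the tests.
    if not hasattr(VCRHTTPResponse, "version_string"):
        VCRHTTPResponse.version_string = "HTTP/1.1"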
_ GandiRESTProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ self = > ???
tests/providers/integration_tests.py:501:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/gandi.py:95: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/gandi.py:288: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GandiRESTProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ???
tests/providers/integration_tests.py:173:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/gandi.py:95: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/gandi.py:288: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GandiRESTProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ???
tests/providers/integration_tests.py:166:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/gandi.py:95: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/gandi.py:288: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GandiRESTProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ???
tests/providers/integration_tests.py:240:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/gandi.py:95: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/gandi.py:288: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GandiRESTProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ???
tests/providers/integration_tests.py:251:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/gandi.py:95: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/gandi.py:288: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GandiRESTProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _
self = > ???
tests/providers/integration_tests.py:275:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/gandi.py:95: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/gandi.py:288: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GandiRESTProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _
self = > ???
tests/providers/integration_tests.py:259:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/gandi.py:95: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/gandi.py:288: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
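Every Gandi failure above is the same incompatibility: the log.debug() call at urllib3's connectionpool.py:551 reads response.version_string, but the VCRHTTPResponse object that vcrpy substitutes while replaying the recorded cassettes does not define that attribute, so authentication never reaches the fake API endpoint. A minimal conftest-level shim along the following lines would likely paper over it until the vcrpy package catches up; the vcr.stubs.VCRHTTPResponse import path and the fixed "HTTP/1.1" value are assumptions on my part, not something taken from this build.

```python
# Sketch of a conftest.py workaround (not part of this build): give vcrpy's
# replay response the attribute that recent urllib3 interpolates into its
# "%s://%s:%s ..." debug log line. The attribute only feeds that log call,
# so a constant placeholder is enough to let cassette replays proceed.
from vcr.stubs import VCRHTTPResponse  # assumed import path

if not hasattr(VCRHTTPResponse, "version_string"):
    VCRHTTPResponse.version_string = "HTTP/1.1"  # assumed placeholder value
```

Pinning urllib3 to a release that predates version_string, or updating python-vcrpy to a version that defines it, would address the same failures without patching anything at test time.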
________________ GehirnProviderTests.test_provider_authenticate ________________
self = > ???
tests/providers/integration_tests.py:106:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/gehirn.py:64: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/gehirn.py:313: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GehirnProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
self = > ???
tests/providers/integration_tests.py:126:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/gehirn.py:64: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/gehirn.py:313: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GehirnProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
self = > ???
tests/providers/integration_tests.py:133:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/gehirn.py:64: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/gehirn.py:313: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
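The Gehirn provider tests fail identically; only the cassette differs (GET /dns/v1/zones with Basic auth instead of the Gandi LiveDNS endpoint). If neither urllib3 nor python-vcrpy can be adjusted for this build, the remaining option for check() is to keep the cassette-driven provider tests out of the run. Below is a hedged sketch of a conftest.py collection hook that does this; the tests/providers/ path filter and the skip reason text are illustrative only.

```python
# Hypothetical conftest.py hook: skip the VCR-replayed provider tests when the
# installed vcrpy stub cannot satisfy urllib3's logging call, instead of
# letting every one of them fail with the AttributeError seen above.
import pytest
from vcr.stubs import VCRHTTPResponse  # assumed import path


def pytest_collection_modifyitems(config, items):
    if hasattr(VCRHTTPResponse, "version_string"):
        return  # vcrpy already provides the attribute; run everything
    skip = pytest.mark.skip(
        reason="vcrpy stub lacks version_string expected by newer urllib3"
    )
    for item in items:
        if "tests/providers/" in str(item.fspath):
            item.add_marker(skip)
```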
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
                if read_timeout == 0:
                    raise ReadTimeoutError(
                        self, url, f"Read timed out. (read timeout={read_timeout})"
                    )
                conn.timeout = read_timeout

            # Receive the response from the server
            try:
                response = conn.getresponse()
            except (BaseSSLError, OSError) as e:
                self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
                raise

            # Set properties that are used by the pooling layer.
            response.retries = retries
            response._connection = response_conn  # type: ignore[attr-defined]
            response._pool = self  # type: ignore[attr-defined]

            log.debug(
                '%s://%s:%s "%s %s %s" %s %s',
                self.scheme,
                self.host,
                self.port,
                method,
                url,
>               response.version_string,
                response.status,
                response.length_remaining,
            )
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GehirnProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

self = >
???
tests/providers/integration_tests.py:147:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/gehirn.py:64: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/gehirn.py:313: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
conn =
method = 'GET', url = '/dns/v1/zones', body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...t-Type': 'application/json', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjpwbGFjZWhvbGRlcl9hdXRoX3NlY3JldA=='}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =
preload_content = False, decode_content = False, enforce_content_length = True

[_make_request() source elided: identical in every Gehirn failure in this run]

E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
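Every Gehirn failure in this run is the same crash: urllib3's connection pool logs response.version_string right after getresponse() (the frame at connectionpool.py:551 above), and the VCRHTTPResponse stub that vcrpy substitutes during cassette playback does not define that attribute. The Basic credential in the recorded request is only the cassette's dummy value (it decodes to placeholder_auth_token:placeholder_auth_secret), so no real secret is involved; the tests never reach the network. An updated python-vcrpy that knows about newer urllib3, or pinning urllib3 back, would be the cleaner fix; as a local stop-gap for the test run, a conftest.py shim along the following lines could supply the missing attribute. This is a sketch only: it assumes the stub class is importable as vcr.stubs.VCRHTTPResponse (the class name shown in the traceback) and that mapping the http.client-style integer version is acceptable.

    # conftest.py -- stop-gap shim, not part of the package; assumes the installed
    # vcrpy still lacks version_string on its stubbed response while urllib3
    # logs it in connectionpool._make_request().
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):

        def _version_string(self):
            # Map http.client-style integer versions to the strings urllib3 logs:
            # 10 -> HTTP/1.0, 11 -> HTTP/1.1; fall back to HTTP/1.1 if unknown.
            return {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(getattr(self, "version", 11), "HTTP/1.1")

        VCRHTTPResponse.version_string = property(_version_string)

With either the shim or an updated vcrpy, the nine failures below should disappear together, since they all go through the same recorded GET /dns/v1/zones authentication exchange.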
_ GehirnProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _

self = >
???
tests/providers/integration_tests.py:140:
[traceback and _make_request() frame identical to the failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ GehirnProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _

self = >
???
tests/providers/integration_tests.py:303:
[traceback and _make_request() frame identical to the failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ GehirnProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _

self = >
???
tests/providers/integration_tests.py:327:
[traceback and _make_request() frame identical to the failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ GehirnProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _

self = >
???
tests/providers/integration_tests.py:313:
[traceback and _make_request() frame identical to the failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ GehirnProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _

self = >
???
tests/providers/integration_tests.py:294:
[traceback and _make_request() frame identical to the failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GehirnProviderTests.test_provider_when_calling_list_records_after_setting_ttl _

self = >
???
tests/providers/integration_tests.py:211:
[traceback and _make_request() frame identical to the failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ GehirnProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _

self = >
???
tests/providers/integration_tests.py:199:
[traceback and _make_request() frame identical to the failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ GehirnProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _

self = >
???
tests/providers/integration_tests.py:185:
[traceback and _make_request() frame identical to the failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ GehirnProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _

self = >
???
tests/providers/integration_tests.py:173:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/gehirn.py:64: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/gehirn.py:313: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/v1/zones', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...t-Type': 'application/json', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjpwbGFjZWhvbGRlcl9hdXRoX3NlY3JldA=='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GehirnProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? 
tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gehirn.py:64: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/gehirn.py:313: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/v1/zones', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...t-Type': 'application/json', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjpwbGFjZWhvbGRlcl9hdXRoX3NlY3JldA=='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GehirnProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gehirn.py:64: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/gehirn.py:313: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/v1/zones', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...t-Type': 'application/json', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjpwbGFjZWhvbGRlcl9hdXRoX3NlY3JldA=='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GehirnProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gehirn.py:64: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/gehirn.py:313: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/v1/zones', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...t-Type': 'application/json', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjpwbGFjZWhvbGRlcl9hdXRoX3NlY3JldA=='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GehirnProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gehirn.py:64: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/gehirn.py:313: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/v1/zones', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...t-Type': 'application/json', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjpwbGFjZWhvbGRlcl9hdXRoX3NlY3JldA=='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GehirnProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? 
tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gehirn.py:64: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/gehirn.py:313: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/v1/zones', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...t-Type': 'application/json', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjpwbGFjZWhvbGRlcl9hdXRoX3NlY3JldA=='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError ________________ GlesysProviderTests.test_provider_authenticate ________________ self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/glesys.py:31: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/glesys.py:142: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/domain/list?format=json', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GlesysProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/glesys.py:31: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/glesys.py:142: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/domain/list?format=json', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GlesysProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:133: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/glesys.py:31: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/glesys.py:142: in _request ??? 
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
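All of these GleSYS failures share one underlying problem: the urllib3 in this chroot (2.3 or newer, judging by the connectionpool.py debug log that reads response.version_string) asks the response object for version_string, but the cassette-backed VCRHTTPResponse that vcrpy substitutes for a real urllib3 response during the recorded integration tests never defines that attribute, so every provider call aborts in authenticate() before any assertion runs. A minimal stopgap, assuming no vcrpy release with version_string support is packaged yet, would be a local conftest.py shim (hypothetical, not part of dns-lexicon upstream) that backfills the attribute on the stub class:

# conftest.py (sketch): backfill the attribute that urllib3 >= 2.3 reads when
# logging a completed request. Hypothetical stopgap, not upstream code.
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):
    # The connection pool only uses this value for the debug log line shown
    # in the traceback above, so a plain class attribute is enough for the
    # cassette-backed responses used by these tests.
    VCRHTTPResponse.version_string = "HTTP/1.1"

The proper fix is rebuilding against a python-vcrpy that has caught up with urllib3's response API; the shim is only meant to unblock the chroot build in the meantime.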
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GlesysProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = > ??? 
tests/providers/integration_tests.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/glesys.py:31: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/glesys.py:142: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/domain/list?format=json', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GlesysProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = > ??? tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/glesys.py:31: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/glesys.py:142: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/domain/list?format=json', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GlesysProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/glesys.py:31: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/glesys.py:142: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/domain/list?format=json', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GlesysProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/glesys.py:31: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/glesys.py:142: in _request ??? 
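Every GleSYS case below fails the same way, so the version mismatch itself is easy to confirm directly in the build root before patching anything; a quick check (sketch, assuming the packaged python-urllib3 and python-vcrpy are the ones the test suite imports):

import urllib3
from urllib3.response import HTTPResponse
from vcr.stubs import VCRHTTPResponse

print(urllib3.__version__)                          # expected to be 2.3.x in this chroot
print(hasattr(HTTPResponse, "version_string"))      # True once urllib3 >= 2.3
print(hasattr(VCRHTTPResponse, "version_string"))   # False for the vcrpy that produced these failures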
_ GlesysProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/glesys.py:31: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/glesys.py:142: in _request ??? E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GlesysProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/glesys.py:31: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/glesys.py:142: in _request ??? E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GlesysProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/glesys.py:31: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/glesys.py:142: in _request ??? E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GlesysProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ self = > ??? tests/providers/integration_tests.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/glesys.py:31: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/glesys.py:142: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/domain/list?format=json', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool.
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GlesysProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/glesys.py:31: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/glesys.py:142: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/domain/list?format=json', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GlesysProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? 
tests/providers/integration_tests.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/glesys.py:31: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/glesys.py:142: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/domain/list?format=json', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GlesysProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/glesys.py:31: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/glesys.py:142: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/domain/list?format=json', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GlesysProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/glesys.py:31: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/glesys.py:142: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/domain/list?format=json', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GlesysProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/glesys.py:31: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/glesys.py:142: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/domain/list?format=json', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GlesysProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? 
tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/glesys.py:31: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/glesys.py:142: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/domain/list?format=json', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
        response.retries = retries
        response._connection = response_conn  # type: ignore[attr-defined]
        response._pool = self  # type: ignore[attr-defined]

        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GlesysProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _

self = <...>

    ???

tests/providers/integration_tests.py:259:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/glesys.py:31: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/glesys.py:142: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...>
conn = <...>
method = 'GET', url = '/domain/list?format=json', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn = <...>
preload_content = False, decode_content = False, enforce_content_length = True

    [... full source listing of urllib3.connectionpool.HTTPConnectionPool._make_request
    elided; pytest prints the identical listing, ending at the log.debug() call marked
    above, for every failure in this group ...]

E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_______________ GoDaddyProviderTests.test_provider_authenticate ________________

self = <...>

    ???

tests/providers/integration_tests.py:106:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/godaddy.py:62: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/godaddy.py:332: in _request
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...>
conn = <...>
method = 'GET', url = '/v1/domains/fullm3tal.online', body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'sso-key placeholder_auth_key:placeholder_auth_secret'}
retries = Retry(total=10, connect=None, read=None, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn = <...>
preload_content = False, decode_content = False, enforce_content_length = True

    [... identical _make_request listing elided ...]

E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GoDaddyProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

self = <...>

    ???

tests/providers/integration_tests.py:126:
    [... traceback, locals, and _make_request listing identical to
    GoDaddyProviderTests.test_provider_authenticate above ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GoDaddyProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

self = <...>

    ???

tests/providers/integration_tests.py:133:
    [... traceback, locals, and _make_request listing identical to
    GoDaddyProviderTests.test_provider_authenticate above ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GoDaddyProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

self = <...>

    ???

tests/providers/integration_tests.py:156:
    [... traceback, locals, and _make_request listing identical to
    GoDaddyProviderTests.test_provider_authenticate above ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GoDaddyProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

self = <...>

    ???

tests/providers/integration_tests.py:147:
    [... traceback, locals, and _make_request listing identical to
    GoDaddyProviderTests.test_provider_authenticate above ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GoDaddyProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _

self = <...>

    ???

tests/providers/integration_tests.py:140:
    [... traceback, locals, and _make_request listing identical to
    GoDaddyProviderTests.test_provider_authenticate above ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GoDaddyProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _

self = <...>

    ???

tests/providers/integration_tests.py:489:
    [... traceback, locals, and _make_request listing identical to
    GoDaddyProviderTests.test_provider_authenticate above ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GoDaddyProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _

self = <...>

    ???

tests/providers/integration_tests.py:475:
    [... traceback, locals, and _make_request listing identical to
    GoDaddyProviderTests.test_provider_authenticate above ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GoDaddyProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _

self = <...>

    ???

tests/providers/integration_tests.py:303:
    [... traceback and locals identical to GoDaddyProviderTests.test_provider_authenticate
    above; the log then repeats the _make_request listing for this failure ...]
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GoDaddyProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/godaddy.py:62: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/godaddy.py:332: in _request ??? 
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains/fullm3tal.online', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'sso-key placeholder_auth_key:placeholder_auth_secret'} retries = Retry(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GoDaddyProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/godaddy.py:62: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/godaddy.py:332: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains/fullm3tal.online', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'sso-key placeholder_auth_key:placeholder_auth_secret'} retries = Retry(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GoDaddyProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? 
tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/godaddy.py:62: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/godaddy.py:332: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains/fullm3tal.online', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'sso-key placeholder_auth_key:placeholder_auth_secret'} retries = Retry(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. 
:param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GoDaddyProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/godaddy.py:62: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/godaddy.py:332: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains/fullm3tal.online', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'sso-key placeholder_auth_key:placeholder_auth_secret'} retries = Retry(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. 
If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GoDaddyProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/godaddy.py:62: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/godaddy.py:332: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains/fullm3tal.online', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'sso-key placeholder_auth_key:placeholder_auth_secret'} retries = Retry(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. 
Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GoDaddyProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ self = > ??? tests/providers/integration_tests.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/godaddy.py:62: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/godaddy.py:332: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains/fullm3tal.online', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'sso-key placeholder_auth_key:placeholder_auth_secret'} retries = Retry(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
    [... tail of the previous failure: the same urllib3 _make_request() source
     listing and debug log call as shown for the failure below ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GoDaddyProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _

tests/providers/integration_tests.py:507:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/godaddy.py:62: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/godaddy.py:332: in _request
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

method = 'GET', url = '/v1/domains/fullm3tal.online', body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'sso-key placeholder_auth_key:placeholder_auth_secret'}
retries = Retry(total=10, connect=None, read=None, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
preload_content = False, decode_content = False, enforce_content_length = True

    [... urllib3's _make_request() source listing (signature, docstring and body)
     is re-displayed by pytest here, identical to the occurrences above, ending at
     the debug log call below ...]

    log.debug(
        '%s://%s:%s "%s %s %s" %s %s',
        self.scheme,
        self.host,
        self.port,
        method,
        url,
>       response.version_string,
        response.status,
        response.length_remaining,
    )
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
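Note (editor): the failure is not specific to the GoDaddy provider. The traceback shows that conn.getresponse() already returned a replayed response; the exception is raised by the debug log call in urllib3's connection pool, which reads response.version_string. That attribute appears to have been introduced in the urllib3 2.3 series, while the object produced during cassette replay is vcrpy's VCRHTTPResponse stub, which in the version installed here does not define it. Until the stub gains the attribute upstream, a local shim along the following lines is conceivable; this is a sketch under that assumption, not the fix applied to the package.

    # conftest.py sketch: a compatibility shim, not the package's actual fix.
    # Assumption: vcrpy's replay stub merely lacks the attribute; "HTTP/1.1" is a
    # guessed default, a stricter shim could derive it from the recorded response.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        VCRHTTPResponse.version_string = "HTTP/1.1"

With such a shim importable from the test session's conftest.py, the log.debug() call at connectionpool.py:551 would find the attribute and the replayed responses would pass through unchanged.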
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains/fullm3tal.online', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'sso-key placeholder_auth_key:placeholder_auth_secret'} retries = Retry(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GoDaddyProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/godaddy.py:62: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/godaddy.py:332: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains/fullm3tal.online', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'sso-key placeholder_auth_key:placeholder_auth_secret'} retries = Retry(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GoDaddyProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ self = > ??? 
tests/providers/integration_tests.py:501: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/godaddy.py:62: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/godaddy.py:332: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains/fullm3tal.online', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'sso-key placeholder_auth_key:placeholder_auth_secret'} retries = Retry(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. 
:param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GoDaddyProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/godaddy.py:62: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/godaddy.py:332: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains/fullm3tal.online', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'sso-key placeholder_auth_key:placeholder_auth_secret'} retries = Retry(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. 
If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GoDaddyProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/godaddy.py:62: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/godaddy.py:332: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains/fullm3tal.online', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'sso-key placeholder_auth_key:placeholder_auth_secret'} retries = Retry(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. 
Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GoDaddyProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/godaddy.py:62: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/godaddy.py:332: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains/fullm3tal.online', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'sso-key placeholder_auth_key:placeholder_auth_secret'} retries = Retry(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GoDaddyProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/godaddy.py:62: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/godaddy.py:332: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains/fullm3tal.online', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'sso-key placeholder_auth_key:placeholder_auth_secret'} retries = Retry(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GoDaddyProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/godaddy.py:62: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/godaddy.py:332: in _request ??? 
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/domains/fullm3tal.online', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'sso-key placeholder_auth_key:placeholder_auth_secret'} retries = Retry(total=10, connect=None, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GoDaddyProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
_ GoDaddyProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _

???
tests/providers/integration_tests.py:259:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/godaddy.py:62: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/godaddy.py:332: in _request
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
method = 'GET', url = '/v1/domains/fullm3tal.online', body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Content-Type': 'application/json', 'Authorization': 'sso-key placeholder_auth_key:placeholder_auth_secret'}
retries = Retry(total=10, connect=None, read=None, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
preload_content = False, decode_content = False, enforce_content_length = True
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
________________ GoogleCloudDnsTests.test_provider_authenticate ________________

???
tests/providers/integration_tests.py:106:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/googleclouddns.py:175: in authenticate
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
method = 'POST', url = '/oauth2/v4/token'
body = 'grant_type=urn%3Aietf%3Aparams%3Aoauth%3Agrant-type%3Ajwt-bearer&assertion=eyJhbGciOiAiUlMyNTYiLCAidHlwIjogIkpXVCJ9.e...9Cm2CdInca1Tb3lhDWArDgUl18yqn4QH9z8mrzq-Nbfclit9BRj0Esb9wNYzzI-XzDrd7oSkj0VlZPYR-SAMGBWDxUMTsY8Sfy-PytkGmO08EzwQ%3D%3D'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/x-www-form-urlencoded', 'Content-Length': '773'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
preload_content = False, decode_content = False, enforce_content_length = True
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GoogleCloudDnsTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

???
tests/providers/integration_tests.py:126:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/googleclouddns.py:175: in authenticate
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
method = 'POST', url = '/oauth2/v4/token'
body = 'grant_type=urn%3Aietf%3Aparams%3Aoauth%3Agrant-type%3Ajwt-bearer&assertion=eyJhbGciOiAiUlMyNTYiLCAidHlwIjogIkpXVCJ9.e...dtefOMBqjfntCAixN_MC_zWSbnAn_H84QKBB-OQodRZhnjZcPPZwvKgU7TudrfrFSMXArFjartpGBZPKLcVUiuoSZS0FLNGE8XZP9u_A6aSVZuFw%3D%3D'
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GoogleCloudDnsTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

???
tests/providers/integration_tests.py:133:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/googleclouddns.py:175: in authenticate
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
method = 'POST', url = '/oauth2/v4/token'
body = 'grant_type=urn%3Aietf%3Aparams%3Aoauth%3Agrant-type%3Ajwt-bearer&assertion=eyJhbGciOiAiUlMyNTYiLCAidHlwIjogIkpXVCJ9.e...dtefOMBqjfntCAixN_MC_zWSbnAn_H84QKBB-OQodRZhnjZcPPZwvKgU7TudrfrFSMXArFjartpGBZPKLcVUiuoSZS0FLNGE8XZP9u_A6aSVZuFw%3D%3D'
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GoogleCloudDnsTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

???
tests/providers/integration_tests.py:156:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/googleclouddns.py:175: in authenticate
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
method = 'POST', url = '/oauth2/v4/token'
body = 'grant_type=urn%3Aietf%3Aparams%3Aoauth%3Agrant-type%3Ajwt-bearer&assertion=eyJhbGciOiAiUlMyNTYiLCAidHlwIjogIkpXVCJ9.e...32pJPGll8f_BJVB0AHir2bGWH9J57hawyWzWADKWRnGKgtHdcHJFjjdYcgyOOr8ek3Xf9HEPWohbAb_YoXvvoImXugbRU1Nb6R4OQTWCUtRfuudA%3D%3D'
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GoogleCloudDnsTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

???
tests/providers/integration_tests.py:147:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/googleclouddns.py:175: in authenticate
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
method = 'POST', url = '/oauth2/v4/token'
body = 'grant_type=urn%3Aietf%3Aparams%3Aoauth%3Agrant-type%3Ajwt-bearer&assertion=eyJhbGciOiAiUlMyNTYiLCAidHlwIjogIkpXVCJ9.e...32pJPGll8f_BJVB0AHir2bGWH9J57hawyWzWADKWRnGKgtHdcHJFjjdYcgyOOr8ek3Xf9HEPWohbAb_YoXvvoImXugbRU1Nb6R4OQTWCUtRfuudA%3D%3D'
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GoogleCloudDnsTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _

???
tests/providers/integration_tests.py:140:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/googleclouddns.py:175: in authenticate
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
method = 'POST', url = '/oauth2/v4/token'
body = 'grant_type=urn%3Aietf%3Aparams%3Aoauth%3Agrant-type%3Ajwt-bearer&assertion=eyJhbGciOiAiUlMyNTYiLCAidHlwIjogIkpXVCJ9.e...lyGUhLQHV_YFnSeAafq1A96mo1PIyYUMGve4mXpu8lG-XeQA9xosRR49a-QOFN1TA1zy8SrP6gHNAcKw4knbqZqzDH4ALytMLPQDwz5ndwaXiU8g%3D%3D'
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GoogleCloudDnsTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _

???
tests/providers/integration_tests.py:489:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/googleclouddns.py:175: in authenticate
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
method = 'POST', url = '/oauth2/v4/token'
body = 'grant_type=urn%3Aietf%3Aparams%3Aoauth%3Agrant-type%3Ajwt-bearer&assertion=eyJhbGciOiAiUlMyNTYiLCAidHlwIjogIkpXVCJ9.e...qTXkEqnfXffGUayNPub6m73Rj1Rtrp9gX93J2XpEO77IOHIa-EPAL00_5vpYUBTL0fzNCsYqJuG3NEq5TnOz5yBopmEEGnSTEwqUom-sb7XB6Kqg%3D%3D'
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GoogleCloudDnsTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _

???
tests/providers/integration_tests.py:475:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/googleclouddns.py:175: in authenticate
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
method = 'POST', url = '/oauth2/v4/token'
body = 'grant_type=urn%3Aietf%3Aparams%3Aoauth%3Agrant-type%3Ajwt-bearer&assertion=eyJhbGciOiAiUlMyNTYiLCAidHlwIjogIkpXVCJ9.e...qTXkEqnfXffGUayNPub6m73Rj1Rtrp9gX93J2XpEO77IOHIa-EPAL00_5vpYUBTL0fzNCsYqJuG3NEq5TnOz5yBopmEEGnSTEwqUom-sb7XB6Kqg%3D%3D'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/x-www-form-urlencoded', 'Content-Length': '773'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
preload_content = False, decode_content = False, enforce_content_length = True
    def _make_request(
        self,
        conn: BaseHTTPConnection,
        method: str,
        url: str,
        body: _TYPE_BODY | None = None,
        headers: typing.Mapping[str, str] | None = None,
        retries: Retry | None = None,
        timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
        chunked: bool = False,
        response_conn: BaseHTTPConnection | None = None,
        preload_content: bool = True,
        decode_content: bool = True,
        enforce_content_length: bool = True,
    ) -> BaseHTTPResponse:
        """
        Perform a request on
a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. 
# With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GoogleCloudDnsTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/googleclouddns.py:175: in authenticate ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/oauth2/v4/token' body = 'grant_type=urn%3Aietf%3Aparams%3Aoauth%3Agrant-type%3Ajwt-bearer&assertion=eyJhbGciOiAiUlMyNTYiLCAidHlwIjogIkpXVCJ9.e...pO761cIm0DjJVeHcZUadx6bWD4CE_cguZBHg-KB6ciPipOpiEyVBaqMT1upEin1LruHkk9CC0oH1VWS1tD2ZfAe4StJmfeOi40wIp74yMzfg3HjQ%3D%3D' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/x-www-form-urlencoded', 'Content-Length': '773'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GoogleCloudDnsTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/googleclouddns.py:175: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/oauth2/v4/token' body = 'grant_type=urn%3Aietf%3Aparams%3Aoauth%3Agrant-type%3Ajwt-bearer&assertion=eyJhbGciOiAiUlMyNTYiLCAidHlwIjogIkpXVCJ9.e...pO761cIm0DjJVeHcZUadx6bWD4CE_cguZBHg-KB6ciPipOpiEyVBaqMT1upEin1LruHkk9CC0oH1VWS1tD2ZfAe4StJmfeOi40wIp74yMzfg3HjQ%3D%3D' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/x-www-form-urlencoded', 'Content-Length': '773'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GoogleCloudDnsTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/googleclouddns.py:175: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/oauth2/v4/token' body = 'grant_type=urn%3Aietf%3Aparams%3Aoauth%3Agrant-type%3Ajwt-bearer&assertion=eyJhbGciOiAiUlMyNTYiLCAidHlwIjogIkpXVCJ9.e..._JWNo5RdcE8kbN5ymg4BAAZIk-nTEl1bO76cvtZWNMUmQEoG88dzNac_-a9J3pfk9P2kE2On2_VZyB9lK-f5FO5vUw4PRBt3QbArvRFzdfRAymcg%3D%3D' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/x-www-form-urlencoded', 'Content-Length': '773'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GoogleCloudDnsTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/googleclouddns.py:175: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/oauth2/v4/token' body = 'grant_type=urn%3Aietf%3Aparams%3Aoauth%3Agrant-type%3Ajwt-bearer&assertion=eyJhbGciOiAiUlMyNTYiLCAidHlwIjogIkpXVCJ9.e..._JWNo5RdcE8kbN5ymg4BAAZIk-nTEl1bO76cvtZWNMUmQEoG88dzNac_-a9J3pfk9P2kE2On2_VZyB9lK-f5FO5vUw4PRBt3QbArvRFzdfRAymcg%3D%3D' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/x-www-form-urlencoded', 'Content-Length': '773'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a 
given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. 
# With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GoogleCloudDnsTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/googleclouddns.py:175: in authenticate ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/oauth2/v4/token' body = 'grant_type=urn%3Aietf%3Aparams%3Aoauth%3Agrant-type%3Ajwt-bearer&assertion=eyJhbGciOiAiUlMyNTYiLCAidHlwIjogIkpXVCJ9.e...nBfWsf01Lpd94pEheqBCGVWM9GxZxT_ROXb2Q_RL8FMjFIikkHeBw_nJ5vUMNB2JMbh_37nMciDlbRkWAvprqLryu-wJ4_vFFqpf7Hu-4Vi72Zzw%3D%3D' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/x-www-form-urlencoded', 'Content-Length': '773'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GoogleCloudDnsTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/googleclouddns.py:175: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/oauth2/v4/token' body = 'grant_type=urn%3Aietf%3Aparams%3Aoauth%3Agrant-type%3Ajwt-bearer&assertion=eyJhbGciOiAiUlMyNTYiLCAidHlwIjogIkpXVCJ9.e...MYBjK6Q11zycFwVGgfnunBorzuvD1CkpREgP434ZKCVCiApylLiX8M9998wmz2N5_qj-4X7aEfp90wlkk2ARGv7cyixqarqrkw7xNiIQjO1B8BfA%3D%3D' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/x-www-form-urlencoded', 'Content-Length': '773'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GoogleCloudDnsTests.test_provider_when_calling_list_records_after_setting_ttl _ self = > ??? tests/providers/integration_tests.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/googleclouddns.py:175: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/oauth2/v4/token' body = 'grant_type=urn%3Aietf%3Aparams%3Aoauth%3Agrant-type%3Ajwt-bearer&assertion=eyJhbGciOiAiUlMyNTYiLCAidHlwIjogIkpXVCJ9.e...MYBjK6Q11zycFwVGgfnunBorzuvD1CkpREgP434ZKCVCiApylLiX8M9998wmz2N5_qj-4X7aEfp90wlkk2ARGv7cyixqarqrkw7xNiIQjO1B8BfA%3D%3D' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/x-www-form-urlencoded', 'Content-Length': '773'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
_ GoogleCloudDnsTests.test_provider_when_calling_list_records_should_handle_record_sets _

tests/providers/integration_tests.py:507:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/googleclouddns.py:175: in authenticate
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
method = 'POST', url = '/oauth2/v4/token'
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GoogleCloudDnsTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _

tests/providers/integration_tests.py:199:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/googleclouddns.py:175: in authenticate
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
method = 'POST', url = '/oauth2/v4/token'
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
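For context on what the cassettes are replaying here: the locals in each traceback show a POST to /oauth2/v4/token with grant_type=urn:ietf:params:oauth:grant-type:jwt-bearer, i.e. the Google service-account token exchange that authenticate() performs before any records are listed. A rough sketch of that exchange follows; the host is not visible in this log (www.googleapis.com is assumed), and fetch_access_token is a made-up name, not lexicon's API:

# Hypothetical sketch of the token request visible in the locals above; not
# lexicon's actual implementation (googleclouddns.py:175 is not shown in this log).
import requests

def fetch_access_token(signed_jwt: str) -> str:
    # Host assumed; only the path /oauth2/v4/token appears in the traceback.
    resp = requests.post(
        "https://www.googleapis.com/oauth2/v4/token",
        data={
            "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",
            "assertion": signed_jwt,
        },
    )
    resp.raise_for_status()
    return resp.json()["access_token"]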
_ GoogleCloudDnsTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _

tests/providers/integration_tests.py:185:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/googleclouddns.py:175: in authenticate
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
method = 'POST', url = '/oauth2/v4/token'
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GoogleCloudDnsTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _

tests/providers/integration_tests.py:501:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/googleclouddns.py:175: in authenticate
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
method = 'POST', url = '/oauth2/v4/token'
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GoogleCloudDnsTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _

tests/providers/integration_tests.py:173:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/googleclouddns.py:175: in authenticate
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
method = 'POST', url = '/oauth2/v4/token'
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
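The retries and timeout values shown in every traceback appear to come from requests' adapter defaults rather than from lexicon itself: Retry(total=0, read=False, ...) tells urllib3 not to retry the request at all, and Timeout(connect=None, read=None, total=None) applies no explicit timeout. The same objects can be built directly with urllib3's public helpers; a small sketch under that assumption, not code from this build:

# Sketch: constructing the Retry/Timeout configuration shown in the locals above
# with urllib3's public utilities.
from urllib3.util import Retry, Timeout

retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None)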
_ GoogleCloudDnsTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _

tests/providers/integration_tests.py:166:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/googleclouddns.py:175: in authenticate
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
method = 'POST', url = '/oauth2/v4/token'
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GoogleCloudDnsTests.test_provider_when_calling_update_record_should_modify_record _

tests/providers/integration_tests.py:240:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/googleclouddns.py:175: in authenticate
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
method = 'POST', url = '/oauth2/v4/token'
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GoogleCloudDnsTests.test_provider_when_calling_update_record_should_modify_record_name_specified _

tests/providers/integration_tests.py:251:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/googleclouddns.py:175: in authenticate
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
method = 'POST', url = '/oauth2/v4/token'
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
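Because this single AttributeError decides whether the whole check() phase passes, it is worth confirming which urllib3 and vcrpy versions the build chroot actually installed before deciding on a fix; a quick check, sketched under the assumption that both distributions are importable in the chroot's Python 3.13:

# Sketch: print the installed versions of the two packages involved in the
# failing attribute lookup.
from importlib.metadata import version

for dist in ("urllib3", "vcrpy"):
    print(dist, version(dist))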
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GoogleCloudDnsTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/googleclouddns.py:175: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/oauth2/v4/token' body = 'grant_type=urn%3Aietf%3Aparams%3Aoauth%3Agrant-type%3Ajwt-bearer&assertion=eyJhbGciOiAiUlMyNTYiLCAidHlwIjogIkpXVCJ9.e...mZE3NDUd5GPW1U5HlQDoYOUQhDOJMdBnAO0RvU1L_0zaBK16baValXRqxm2EHBWtxqASYf3Bt3NhfMkt4hJiPiKZUzfw7htWwERntHkUsqdkPSUw%3D%3D' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/x-www-form-urlencoded', 'Content-Length': '773'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a 
given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. 
# With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GoogleCloudDnsTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/googleclouddns.py:175: in authenticate ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/oauth2/v4/token' body = 'grant_type=urn%3Aietf%3Aparams%3Aoauth%3Agrant-type%3Ajwt-bearer&assertion=eyJhbGciOiAiUlMyNTYiLCAidHlwIjogIkpXVCJ9.e...A24VvP4yYJyRwuslBpFM4cNFRa1ibMSetIN_r_gtP_oje6nB-HyUs7jZWgfCaKV2BbGzvxk94KuAHflqNYhb6y_bZVXqW9V0-tVLhozmAT2pdfWg%3D%3D' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/x-www-form-urlencoded', 'Content-Length': '773'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError ________________ GransyProviderTests.test_provider_authenticate ________________ self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:417: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gransy.py:46: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/client.py:76: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:86: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:89: in load ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:149: in _get_xml_document ??? /usr/lib/python3.13/site-packages/zeep/loader.py:89: in load_external ??? /usr/lib/python3.13/site-packages/zeep/transports.py:122: in load ??? /usr/lib/python3.13/site-packages/zeep/transports.py:134: in _load_remote_data ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/wsdl', body = None headers = {'User-Agent': 'Zeep/4.3.1 (www.python-zeep.org)', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=300, read=300, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. 
Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GransyProviderTests.test_provider_authenticate_with_unmanaged_domain_should_fail _ self = > ??? tests/providers/integration_tests.py:115: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/_private/providers/gransy.py:46: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/client.py:76: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:86: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:89: in load ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:149: in _get_xml_document ??? /usr/lib/python3.13/site-packages/zeep/loader.py:89: in load_external ??? /usr/lib/python3.13/site-packages/zeep/transports.py:122: in load ??? /usr/lib/python3.13/site-packages/zeep/transports.py:134: in _load_remote_data ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/wsdl', body = None headers = {'User-Agent': 'Zeep/4.3.1 (www.python-zeep.org)', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=300, read=300, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GransyProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:417: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gransy.py:46: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/client.py:76: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:86: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:89: in load ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:149: in _get_xml_document ??? /usr/lib/python3.13/site-packages/zeep/loader.py:89: in load_external ??? /usr/lib/python3.13/site-packages/zeep/transports.py:122: in load ??? /usr/lib/python3.13/site-packages/zeep/transports.py:134: in _load_remote_data ??? 
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/wsdl', body = None headers = {'User-Agent': 'Zeep/4.3.1 (www.python-zeep.org)', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=300, read=300, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. 
Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GransyProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:133: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:417: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gransy.py:46: in __init__ ??? 
/usr/lib/python3.13/site-packages/zeep/client.py:76: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:86: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:89: in load ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:149: in _get_xml_document ??? /usr/lib/python3.13/site-packages/zeep/loader.py:89: in load_external ??? /usr/lib/python3.13/site-packages/zeep/transports.py:122: in load ??? /usr/lib/python3.13/site-packages/zeep/transports.py:134: in _load_remote_data ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/wsdl', body = None headers = {'User-Agent': 'Zeep/4.3.1 (www.python-zeep.org)', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=300, read=300, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GransyProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = > ??? tests/providers/integration_tests.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:417: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gransy.py:46: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/client.py:76: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:86: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:89: in load ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:149: in _get_xml_document ??? /usr/lib/python3.13/site-packages/zeep/loader.py:89: in load_external ??? /usr/lib/python3.13/site-packages/zeep/transports.py:122: in load ??? /usr/lib/python3.13/site-packages/zeep/transports.py:134: in _load_remote_data ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/wsdl', body = None headers = {'User-Agent': 'Zeep/4.3.1 (www.python-zeep.org)', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=300, read=300, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. 
Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GransyProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = > ??? tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:417: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gransy.py:46: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/client.py:76: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:86: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:89: in load ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:149: in _get_xml_document ??? /usr/lib/python3.13/site-packages/zeep/loader.py:89: in load_external ??? /usr/lib/python3.13/site-packages/zeep/transports.py:122: in load ??? /usr/lib/python3.13/site-packages/zeep/transports.py:134: in _load_remote_data ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/wsdl', body = None headers = {'User-Agent': 'Zeep/4.3.1 (www.python-zeep.org)', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=300, read=300, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) 

The remaining GransyProviderTests fail the same way: each dies while the provider is constructed, when zeep fetches the recorded WSDL and urllib3 hits the same missing attribute on the replayed response.

_ GransyProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _
tests/providers/integration_tests.py:147 -> integration_tests.py:417 (_construct_authenticated_provider) -> src/lexicon/_private/providers/gransy.py:46 (__init__) -> zeep WSDL load -> urllib3 connectionpool.py:551
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

_ GransyProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _
tests/providers/integration_tests.py:140 -> integration_tests.py:417 (_construct_authenticated_provider) -> src/lexicon/_private/providers/gransy.py:46 (__init__) -> zeep WSDL load -> urllib3 connectionpool.py:551
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

_ GransyProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _
tests/providers/integration_tests.py:489 -> integration_tests.py:417 (_construct_authenticated_provider) -> src/lexicon/_private/providers/gransy.py:46 (__init__) -> zeep WSDL load -> urllib3 connectionpool.py:551
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

_ GransyProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _
tests/providers/integration_tests.py:475 -> integration_tests.py:417 (_construct_authenticated_provider) -> src/lexicon/_private/providers/gransy.py:46 (__init__) -> zeep WSDL load -> urllib3 connectionpool.py:551
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

_ GransyProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _
tests/providers/integration_tests.py:303 -> integration_tests.py:417 (_construct_authenticated_provider) -> src/lexicon/_private/providers/gransy.py:46 (__init__) -> zeep WSDL load -> urllib3 connectionpool.py:551
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

_ GransyProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _
tests/providers/integration_tests.py:327 -> integration_tests.py:417 (_construct_authenticated_provider) -> src/lexicon/_private/providers/gransy.py:46 (__init__) -> zeep WSDL load -> urllib3 connectionpool.py:551
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

_ GransyProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _
tests/providers/integration_tests.py:313 -> integration_tests.py:417 (_construct_authenticated_provider) -> src/lexicon/_private/providers/gransy.py:46 (__init__) -> zeep WSDL load -> urllib3 connectionpool.py:551
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

_ GransyProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _
tests/providers/integration_tests.py:294 -> integration_tests.py:417 (_construct_authenticated_provider) -> src/lexicon/_private/providers/gransy.py:46 (__init__) -> zeep WSDL load -> urllib3 connectionpool.py:551
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

_ GransyProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _
tests/providers/integration_tests.py:541 -> integration_tests.py:417 (_construct_authenticated_provider) -> src/lexicon/_private/providers/gransy.py:46 (__init__) -> zeep WSDL load -> urllib3 connectionpool.py:551
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
_ GransyProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _

self = >

    ???

tests/providers/integration_tests.py:541:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:417: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/gransy.py:46: in __init__
    ???
/usr/lib/python3.13/site-packages/zeep/client.py:76: in __init__
    ???
/usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:86: in __init__
    ???
/usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:89: in load
    ???
/usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:149: in _get_xml_document
    ???
/usr/lib/python3.13/site-packages/zeep/loader.py:89: in load_external
    ???
/usr/lib/python3.13/site-packages/zeep/transports.py:122: in load
    ???
/usr/lib/python3.13/site-packages/zeep/transports.py:134: in _load_remote_data
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
conn =
method = 'GET', url = '/wsdl', body = None
headers = {'User-Agent': 'Zeep/4.3.1 (www.python-zeep.org)', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=300, read=300, total=None), chunked = False
response_conn =
preload_content = False, decode_content = False, enforce_content_length = True

[The _make_request source listing printed here is identical to the one above
and is omitted; the request fails the same way:]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

[The next seven GransyProviderTests failures repeat the identical traceback,
request parameters, _make_request listing, and AttributeError at
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551; only the
entry point in tests/providers/integration_tests.py differs:]

_ GransyProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _
    tests/providers/integration_tests.py:521
_ GransyProviderTests.test_provider_when_calling_list_records_after_setting_ttl _
    tests/providers/integration_tests.py:211
_ GransyProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _
    tests/providers/integration_tests.py:507
_ GransyProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _
    tests/providers/integration_tests.py:199
_ GransyProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _
    tests/providers/integration_tests.py:185
_ GransyProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _
    tests/providers/integration_tests.py:501
_ GransyProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _
    tests/providers/integration_tests.py:173
:param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GransyProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:417: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gransy.py:46: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/client.py:76: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:86: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:89: in load ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:149: in _get_xml_document ??? /usr/lib/python3.13/site-packages/zeep/loader.py:89: in load_external ??? /usr/lib/python3.13/site-packages/zeep/transports.py:122: in load ??? /usr/lib/python3.13/site-packages/zeep/transports.py:134: in _load_remote_data ??? 
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/wsdl', body = None headers = {'User-Agent': 'Zeep/4.3.1 (www.python-zeep.org)', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=300, read=300, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. 
Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GransyProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:417: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gransy.py:46: in __init__ ??? 
/usr/lib/python3.13/site-packages/zeep/client.py:76: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:86: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:89: in load ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:149: in _get_xml_document ??? /usr/lib/python3.13/site-packages/zeep/loader.py:89: in load_external ??? /usr/lib/python3.13/site-packages/zeep/transports.py:122: in load ??? /usr/lib/python3.13/site-packages/zeep/transports.py:134: in _load_remote_data ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/wsdl', body = None headers = {'User-Agent': 'Zeep/4.3.1 (www.python-zeep.org)', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=300, read=300, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
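# Annotation on the statements below (not part of the urllib3 source being dumped):
# the log.debug call that follows is where every failure in this run surfaces.
# During cassette playback vcrpy hands back its VCRHTTPResponse stub in place of
# urllib3's HTTPResponse, and this urllib3 release logs response.version_string,
# an attribute the stub does not define. The failure therefore happens inside the
# replayed HTTP layer, before the assertions in the provider test bodies run.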
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GransyProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:417: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gransy.py:46: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/client.py:76: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:86: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:89: in load ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:149: in _get_xml_document ??? /usr/lib/python3.13/site-packages/zeep/loader.py:89: in load_external ??? /usr/lib/python3.13/site-packages/zeep/transports.py:122: in load ??? /usr/lib/python3.13/site-packages/zeep/transports.py:134: in _load_remote_data ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/wsdl', body = None headers = {'User-Agent': 'Zeep/4.3.1 (www.python-zeep.org)', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=300, read=300, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. 
Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GransyProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:417: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gransy.py:46: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/client.py:76: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:86: in __init__ ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:89: in load ??? /usr/lib/python3.13/site-packages/zeep/wsdl/wsdl.py:149: in _get_xml_document ??? /usr/lib/python3.13/site-packages/zeep/loader.py:89: in load_external ??? /usr/lib/python3.13/site-packages/zeep/transports.py:122: in load ??? /usr/lib/python3.13/site-packages/zeep/transports.py:134: in _load_remote_data ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/wsdl', body = None headers = {'User-Agent': 'Zeep/4.3.1 (www.python-zeep.org)', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=300, read=300, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) 
:param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError ______________ GratisDNSProviderTests.test_provider_authenticate _______________ self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gratisdns.py:48: in authenticate ??? 
/usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/' body = 'action=logmein&login=placeholder_auth_username&password=placeholder_auth_password' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '81', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
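# A minimal, hypothetical workaround sketch (not part of this build log): if the
# mismatch is only that the vcrpy stub predates urllib3's version_string
# attribute, a test-suite conftest.py could backfill it. The import path
# vcr.stubs.VCRHTTPResponse and the "HTTP/1.x" mapping are assumptions inferred
# from the traceback, not verified here; newer vcrpy releases may already ship
# an equivalent attribute, in which case this shim is a no-op.
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):
    def _version_string(self) -> str:
        # http.client uses 10/11 for HTTP/1.0 and HTTP/1.1; default to 1.1.
        return "HTTP/1.0" if getattr(self, "version", 11) == 10 else "HTTP/1.1"

    VCRHTTPResponse.version_string = property(_version_string)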
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GratisDNSProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gratisdns.py:48: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/' body = 'action=logmein&login=placeholder_auth_username&password=placeholder_auth_password' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '81', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
        [remainder of the urllib3 connectionpool._make_request frame, identical to the failures above]
        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme, self.host, self.port, method, url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GratisDNSProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

tests/providers/integration_tests.py:133:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/gratisdns.py:48: in authenticate
    ???
/usr/lib/python3.13/site-packages/requests/api.py:115: in post
    return request("post", url, data=data, json=json, **kwargs)
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

method = 'POST', url = '/'
body = 'action=logmein&login=placeholder_auth_username&password=placeholder_auth_password'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*',
           'Connection': 'keep-alive', 'Content-Length': '81',
           'Content-Type': 'application/x-www-form-urlencoded'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
preload_content = False, decode_content = False, enforce_content_length = True

[urllib3 connectionpool._make_request source listing identical to the previous failure]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GratisDNSProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

tests/providers/integration_tests.py:156: [same call chain through gratisdns.py:48 (authenticate) into urllib3 connectionpool._make_request]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
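Every one of these failures replays the same login request. For reference, here is a hedged sketch of that request, rebuilt from the locals in the traceback but issued through urllib3's public PoolManager API rather than the internal _make_request frame; the host is a placeholder (only url='/' is visible in the log), and this snippet is an illustration, not part of the test suite.

# Hedged sketch only: the request urllib3 is replaying when each test fails,
# reconstructed from the locals shown above via the public PoolManager API.
import urllib3
from urllib3.util import Retry, Timeout

http = urllib3.PoolManager()
resp = http.request(
    "POST",
    "https://example.invalid/",  # placeholder host, not the provider's real endpoint
    body="action=logmein&login=placeholder_auth_username&password=placeholder_auth_password",
    headers={"Content-Type": "application/x-www-form-urlencoded"},
    retries=Retry(total=0, connect=None, read=False, redirect=None, status=None),
    timeout=Timeout(connect=None, read=None, total=None),
)
# On urllib3 >= 2.3 the response exposes version_string -- the attribute the
# VCR replay stub is missing in the failures above.
print(resp.status, resp.version_string)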
_ GratisDNSProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _
tests/providers/integration_tests.py:147: [same call chain and _make_request frame as above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ GratisDNSProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _
tests/providers/integration_tests.py:140: [same call chain and _make_request frame as above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ GratisDNSProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _
tests/providers/integration_tests.py:489: [same call chain and _make_request frame as above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ GratisDNSProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _
tests/providers/integration_tests.py:475: [same call chain and _make_request frame as above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ GratisDNSProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _
tests/providers/integration_tests.py:303: [same call chain and _make_request frame as above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ GratisDNSProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _
tests/providers/integration_tests.py:327: [same call chain and _make_request frame as above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ GratisDNSProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _
tests/providers/integration_tests.py:313: [same call chain and _make_request frame as above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ GratisDNSProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _
tests/providers/integration_tests.py:294: [same call chain into urllib3 connectionpool._make_request as above]
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GratisDNSProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gratisdns.py:48: in authenticate ??? 
/usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/' body = 'action=logmein&login=placeholder_auth_username&password=placeholder_auth_password' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '81', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GratisDNSProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gratisdns.py:48: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/' body = 'action=logmein&login=placeholder_auth_username&password=placeholder_auth_password' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '81', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GratisDNSProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ self = > ??? tests/providers/integration_tests.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gratisdns.py:48: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/' body = 'action=logmein&login=placeholder_auth_username&password=placeholder_auth_password' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '81', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GratisDNSProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? tests/providers/integration_tests.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gratisdns.py:48: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/' body = 'action=logmein&login=placeholder_auth_username&password=placeholder_auth_password' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '81', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GratisDNSProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gratisdns.py:48: in authenticate ??? 
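Every GratisDNS test in this run fails the same way: the suite replays recorded HTTP exchanges through vcrpy's VCRHTTPResponse stub, and the urllib3 in the build chroot dereferences response.version_string while logging the request at connectionpool.py:551 (reading version_string there appears to have arrived with urllib3 2.3.0), an attribute the stub does not define, so authenticate() in gratisdns.py aborts with the AttributeError before any assertion runs. A minimal local workaround is sketched below; it assumes a conftest.py at the root of the test tree and a python-vcrpy that still lacks the attribute, and it is illustrative rather than the upstream fix.

# conftest.py -- illustrative shim: give vcrpy's response stub the
# version_string attribute that newer urllib3 reads when logging a request.
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):
    # The attribute is only consumed by urllib3's log.debug call, so a fixed
    # (assumed) "HTTP/1.1" protocol string is enough for cassette playback.
    VCRHTTPResponse.version_string = "HTTP/1.1"

Newer python-vcrpy releases appear to define version_string themselves, so rebuilding against an updated python-vcrpy (or pinning urllib3 below 2.3 in the chroot) should make a shim like this unnecessary.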
/usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/' body = 'action=logmein&login=placeholder_auth_username&password=placeholder_auth_password' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '81', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GratisDNSProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gratisdns.py:48: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/' body = 'action=logmein&login=placeholder_auth_username&password=placeholder_auth_password' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '81', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ GratisDNSProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ self = > ??? tests/providers/integration_tests.py:501: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gratisdns.py:48: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/' body = 'action=logmein&login=placeholder_auth_username&password=placeholder_auth_password' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '81', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
[The eight failures below go through the identical requests/urllib3 frames and _make_request listing shown above; only the failing test and its lexicon entry points differ.]
_ GratisDNSProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ tests/providers/integration_tests.py:501: ??? tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gratisdns.py:48: in authenticate ??? E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GratisDNSProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ tests/providers/integration_tests.py:173: ??? tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gratisdns.py:48: in authenticate ??? E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GratisDNSProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ tests/providers/integration_tests.py:166: ??? tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gratisdns.py:48: in authenticate ??? E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GratisDNSProviderTests.test_provider_when_calling_update_record_should_modify_record _ tests/providers/integration_tests.py:240: ??? tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gratisdns.py:48: in authenticate ??? E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GratisDNSProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ tests/providers/integration_tests.py:251: ??? tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gratisdns.py:48: in authenticate ??? E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GratisDNSProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ tests/providers/integration_tests.py:275: ??? tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gratisdns.py:48: in authenticate ??? E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ GratisDNSProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ tests/providers/integration_tests.py:259: ??? tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/gratisdns.py:48: in authenticate ??? E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ HenetProviderTests.test_provider_authenticate _ tests/providers/integration_tests.py:106: ??? tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/henet.py:48: in authenticate ??? E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ HenetProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/henet.py:48: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ HenetProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:133: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/henet.py:48: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 
:param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ HenetProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = > ??? tests/providers/integration_tests.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/henet.py:48: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 
:param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ HenetProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = > ??? tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/henet.py:48: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 
:param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ HenetProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/henet.py:48: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 
:param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ HenetProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _ self = > ??? tests/providers/integration_tests.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/henet.py:48: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 
:param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ HenetProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ self = > ??? tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/henet.py:48: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 
:param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ HenetProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/henet.py:48: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 
:param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ HenetProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/henet.py:48: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. 
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ HenetProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/henet.py:48: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
_ HenetProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _

self = 

???

tests/providers/integration_tests.py:313: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/henet.py:48: in authenticate
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
>               response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ HenetProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _

self = 

???

tests/providers/integration_tests.py:294: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/henet.py:48: in authenticate
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
>               response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ HenetProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _

self = 

???

tests/providers/integration_tests.py:541: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/henet.py:48: in authenticate
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
>               response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ HenetProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _

self = 

???

tests/providers/integration_tests.py:521: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/henet.py:48: in authenticate
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
>               response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ HenetProviderTests.test_provider_when_calling_list_records_after_setting_ttl _

self = 

???

tests/providers/integration_tests.py:211: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/henet.py:48: in authenticate
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
>               response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ HenetProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _

self = 

???

tests/providers/integration_tests.py:507: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/henet.py:48: in authenticate
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
>               response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ HenetProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _

self = 

???

tests/providers/integration_tests.py:199: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/henet.py:48: in authenticate
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
>               response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ HenetProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _

self = 

???

tests/providers/integration_tests.py:185: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/henet.py:48: in authenticate
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
>               response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
_ HenetProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _
tests/providers/integration_tests.py:173:
[remaining traceback identical to the previous Henet failure: integration_tests.py:418 -> henet.py:48 authenticate -> requests -> urllib3 _make_request]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ HenetProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _
tests/providers/integration_tests.py:166:
[remaining traceback identical to the previous Henet failure: integration_tests.py:418 -> henet.py:48 authenticate -> requests -> urllib3 _make_request]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ HenetProviderTests.test_provider_when_calling_update_record_should_modify_record _
tests/providers/integration_tests.py:240:
[remaining traceback identical to the previous Henet failure: integration_tests.py:418 -> henet.py:48 authenticate -> requests -> urllib3 _make_request]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ HenetProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _
tests/providers/integration_tests.py:251:
[remaining traceback identical to the previous Henet failure: integration_tests.py:418 -> henet.py:48 authenticate -> requests -> urllib3 _make_request]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ HenetProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _
tests/providers/integration_tests.py:275:
[remaining traceback identical to the previous Henet failure: integration_tests.py:418 -> henet.py:48 authenticate -> requests -> urllib3 _make_request]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ HenetProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _
tests/providers/integration_tests.py:259:
[remaining traceback identical to the previous Henet failure: integration_tests.py:418 -> henet.py:48 authenticate -> requests -> urllib3 _make_request]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_______________ HetznerProviderTests.test_provider_authenticate ________________
tests/providers/integration_tests.py:106:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/hetzner.py:39: in authenticate
src/lexicon/_private/providers/hetzner.py:179: in _get_zone_by_domain
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/hetzner.py:153: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
method = 'GET', url = '/api/v1/zones?name=hetzner-api-test.de', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate',
           'Accept': '*/*', 'Connection': 'keep-alive',
           'Auth-API-Token': 'placeholder_auth_token',
           'Content-Type': 'application/json', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
preload_content = False, decode_content = False, enforce_content_length = True
[urllib3 _make_request listing identical to the first failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
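For the Hetzner provider the failure happens one step further into the provider flow: authenticate() resolves the zone with a GET to /api/v1/zones?name=<domain>, carrying the Auth-API-Token header visible in the captured locals, and that replayed request is what trips over the missing version_string. Roughly, and only as an illustration of the frames named above (the real lexicon code differs; the dns.hetzner.com host and the helper name are assumptions):

    # Illustrative sketch of the zone lookup behind hetzner.py:179 / hetzner.py:153;
    # not lexicon's implementation.
    import requests

    def get_zone_by_domain(domain: str, token: str) -> dict:
        # Matches the request in the captured locals: GET /api/v1/zones?name=...
        # with Auth-API-Token and a JSON content type.
        response = requests.get(
            "https://dns.hetzner.com/api/v1/zones",
            params={"name": domain},
            headers={"Auth-API-Token": token, "Content-Type": "application/json"},
        )
        response.raise_for_status()
        return response.json()

    # e.g. get_zone_by_domain("hetzner-api-test.de", "placeholder_auth_token")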
_ HetznerProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
tests/providers/integration_tests.py:126:
[remaining traceback identical to the Hetzner authenticate failure above: hetzner.py:39 authenticate -> hetzner.py:179 _get_zone_by_domain -> hetzner.py:153 _request -> requests -> urllib3 _make_request]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
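Every one of these provider integration tests fails the same way before it can exercise any provider logic, so for packaging purposes it may be preferable to skip the cassette-backed tests when the installed urllib3/vcrpy pair is known to be incompatible, rather than patch the stub. A sketch of such a guard (hypothetical conftest.py hook; the node-id filter is a guess at how these tests are named):

    # conftest.py (hypothetical): skip cassette-backed provider tests when
    # urllib3 expects version_string but vcrpy's response stub lacks it.
    import pytest
    import urllib3
    from packaging.version import Version
    from vcr.stubs import VCRHTTPResponse

    INCOMPATIBLE = Version(urllib3.__version__) >= Version("2.3") and not hasattr(
        VCRHTTPResponse, "version_string"
    )

    def pytest_collection_modifyitems(config, items):
        if not INCOMPATIBLE:
            return
        skip = pytest.mark.skip(
            reason="vcrpy response stub lacks version_string required by urllib3 >= 2.3"
        )
        for item in items:
            # The failures above all come from the shared provider test classes,
            # so filtering on the class name is a rough but workable heuristic.
            if "ProviderTests" in item.nodeid:
                item.add_marker(skip)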
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ HetznerProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _ self = > ??? 
_ HetznerProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
tests/providers/integration_tests.py:133: [traceback identical to the failure above: integration_tests.py:418 _construct_authenticated_provider -> hetzner.py:39 authenticate -> hetzner.py:179 _get_zone_by_domain -> interfaces.py:163 _get -> hetzner.py:153 _request -> requests -> urllib3 connectionpool.py:551]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
_ HetznerProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
tests/providers/integration_tests.py:156: [identical traceback]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
_ HetznerProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _
tests/providers/integration_tests.py:147: [identical traceback]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
_ HetznerProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _
tests/providers/integration_tests.py:140: [identical traceback]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
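If the chroot cannot yet ship a vcrpy release new enough to provide version_string, a conftest-level shim like the sketch below would let the recorded tests run again. It is an illustrative workaround, not part of the dns-lexicon sources: the conftest.py placement, the vcr.stubs import path, and the hard-coded "HTTP/1.1" value are all assumptions.

# conftest.py -- illustrative workaround only; not part of the dns-lexicon sources.
# Newer urllib3 reads response.version_string when logging a finished request.
# Cassette responses replayed by vcrpy (VCRHTTPResponse) predate that attribute,
# which is exactly where every Hetzner test in this log fails, so backfill it
# before anything that talks HTTP is imported.
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):
    # "HTTP/1.1" is assumed purely so urllib3's debug log line has something to
    # print; the recorded fixtures carry no protocol-version information.
    VCRHTTPResponse.version_string = "HTTP/1.1"

The tidier long-term fix is to rebuild once the packaged vcrpy catches up with urllib3 (or to hold urllib3 back in the chroot); the shim only marks where the incompatibility sits.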
_ HetznerProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _
tests/providers/integration_tests.py:489: [identical traceback]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
_ HetznerProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _
tests/providers/integration_tests.py:475: [identical traceback]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
_ HetznerProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _
tests/providers/integration_tests.py:303: [identical traceback]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
_ HetznerProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _
tests/providers/integration_tests.py:327: [identical traceback]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
_ HetznerProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _
self = > ???
tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/hetzner.py:39: in authenticate ??? src/lexicon/_private/providers/hetzner.py:179: in _get_zone_by_domain ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/hetzner.py:153: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/zones?name=hetzner-api-test.de', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Auth-API-Token': 'placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ HetznerProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/hetzner.py:39: in authenticate ??? src/lexicon/_private/providers/hetzner.py:179: in _get_zone_by_domain ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/hetzner.py:153: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/zones?name=hetzner-api-test.de', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Auth-API-Token': 'placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ HetznerProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/hetzner.py:39: in authenticate ??? src/lexicon/_private/providers/hetzner.py:179: in _get_zone_by_domain ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/hetzner.py:153: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/zones?name=hetzner-api-test.de', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Auth-API-Token': 'placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ HetznerProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/hetzner.py:39: in authenticate ??? src/lexicon/_private/providers/hetzner.py:179: in _get_zone_by_domain ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/hetzner.py:153: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/zones?name=hetzner-api-test.de', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Auth-API-Token': 'placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ HetznerProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ self = > ??? 
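
The retries and timeout parameters documented in the _make_request docstring above are the same ones urllib3 accepts through its public request API. A short illustrative sketch of what the docstring describes; the host and values are illustrative only, not taken from this build:

    import urllib3
    from urllib3.util import Retry, Timeout

    http = urllib3.PoolManager()
    # A Retry object gives fine-grained control; a bare int would retry only
    # connection errors, and False disables retries and raises immediately.
    resp = http.request(
        "GET",
        "https://example.org/",
        retries=Retry(total=3, backoff_factor=0.5),
        timeout=Timeout(connect=2.0, read=5.0),
    )
    print(resp.status)
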
tests/providers/integration_tests.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/hetzner.py:39: in authenticate ??? src/lexicon/_private/providers/hetzner.py:179: in _get_zone_by_domain ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/hetzner.py:153: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/zones?name=hetzner-api-test.de', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Auth-API-Token': 'placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ HetznerProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? tests/providers/integration_tests.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/hetzner.py:39: in authenticate ??? src/lexicon/_private/providers/hetzner.py:179: in _get_zone_by_domain ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/hetzner.py:153: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/zones?name=hetzner-api-test.de', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Auth-API-Token': 'placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ HetznerProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/hetzner.py:39: in authenticate ??? src/lexicon/_private/providers/hetzner.py:179: in _get_zone_by_domain ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/hetzner.py:153: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/zones?name=hetzner-api-test.de', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Auth-API-Token': 'placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ HetznerProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/hetzner.py:39: in authenticate ??? src/lexicon/_private/providers/hetzner.py:179: in _get_zone_by_domain ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/hetzner.py:153: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/zones?name=hetzner-api-test.de', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Auth-API-Token': 'placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ HetznerProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ self = > ??? 
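
For context, the replayed request that trips the error is the zone lookup visible in the captured parameters above (GET /api/v1/zones?name=hetzner-api-test.de with an Auth-API-Token header). A standalone sketch of the equivalent call; the dns.hetzner.com base URL is an assumption based on Hetzner's public DNS API, and the token and domain are the test suite's placeholders:

    import requests

    # Equivalent of the zone lookup the provider performs during authenticate().
    resp = requests.get(
        "https://dns.hetzner.com/api/v1/zones",
        params={"name": "hetzner-api-test.de"},
        headers={"Auth-API-Token": "placeholder_auth_token"},
    )
    print(resp.status_code)
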
tests/providers/integration_tests.py:501: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/hetzner.py:39: in authenticate ???
src/lexicon/_private/providers/hetzner.py:179: in _get_zone_by_domain ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/hetzner.py:153: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ HetznerProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _
self = > ??? tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/hetzner.py:39: in authenticate ???
src/lexicon/_private/providers/hetzner.py:179: in _get_zone_by_domain ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/hetzner.py:153: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ HetznerProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _
self = > ??? tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/hetzner.py:39: in authenticate ???
src/lexicon/_private/providers/hetzner.py:179: in _get_zone_by_domain ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/hetzner.py:153: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ HetznerProviderTests.test_provider_when_calling_update_record_should_modify_record _
self = > ??? tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/hetzner.py:39: in authenticate ???
src/lexicon/_private/providers/hetzner.py:179: in _get_zone_by_domain ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/hetzner.py:153: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
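Every failure above has the same root cause: the log.debug() call in urllib3's _make_request() now reads response.version_string, but the VCRHTTPResponse object that vcrpy substitutes for the real response while replaying cassettes does not define that attribute. Below is a minimal test-side shim, only a sketch: it assumes a conftest.py may be added to the test environment and that the installed vcrpy exposes the stub class as vcr.stubs.VCRHTTPResponse; neither assumption comes from this log. The cleaner route, where available, is a vcrpy release that already provides the attribute.

# conftest.py -- hypothetical compatibility shim, not part of the dns-lexicon sources.
# Backfills the version_string attribute that newer urllib3 reads in the debug
# logging of connectionpool._make_request() but that vcrpy's replay stub lacks.
from vcr.stubs import VCRHTTPResponse  # import path assumed from the vcrpy layout

if not hasattr(VCRHTTPResponse, "version_string"):
    # http.client-style responses report version as 10 or 11; map that to the
    # string form urllib3 wants to log, defaulting defensively to HTTP/1.1.
    VCRHTTPResponse.version_string = property(
        lambda self: {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(
            getattr(self, "version", 11), "HTTP/1.1"
        )
    )

With a shim along these lines in place the replayed responses satisfy the attribute lookup and the debug line formats normally; nothing else in the request path changes.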
_ HetznerProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _
self = > ??? tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/hetzner.py:39: in authenticate ???
src/lexicon/_private/providers/hetzner.py:179: in _get_zone_by_domain ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/hetzner.py:153: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ HetznerProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _
self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/hetzner.py:39: in authenticate ???
src/lexicon/_private/providers/hetzner.py:179: in _get_zone_by_domain ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/hetzner.py:153: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ HetznerProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _
self = > ??? tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/hetzner.py:39: in authenticate ???
src/lexicon/_private/providers/hetzner.py:179: in _get_zone_by_domain ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/hetzner.py:153: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
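The FooProviderTests failures that follow hit the hostingde provider instead of Hetzner with the identical AttributeError, which confirms the breakage lives in the vcrpy/urllib3 pairing rather than in any single provider module: every cassette replay that reaches urllib3's debug logging fails the same way. A quick check along these lines (a sketch; the exact versions installed in the chroot are not recorded in this log) can be run in the build root to confirm the skew before choosing between a shim and a vcrpy update:

# check_vcr_urllib3.py -- hypothetical helper, not part of the package.
# Reports the installed urllib3/vcrpy versions and whether vcrpy's replay stub
# already defines the attribute urllib3's connection pool logs.
from importlib.metadata import version

from vcr.stubs import VCRHTTPResponse  # import path assumed, as in the shim above

print("urllib3:", version("urllib3"))
print("vcrpy:", version("vcrpy"))
print("VCRHTTPResponse defines version_string:",
      hasattr(VCRHTTPResponse, "version_string"))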
_________________ FooProviderTests.test_provider_authenticate __________________
self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/hostingde.py:36: in authenticate ???
src/lexicon/_private/providers/hostingde.py:50: in _get_zone_config ???
src/lexicon/_private/providers/hostingde.py:215: in _request ???
method = 'POST', url = '/api/dns/v1/json/zoneConfigsFind', body = '{"filter": {"field": "ZoneName", "value": "eruza.de"}, "authToken": "placeholder_auth_token", "page": 1}'
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ FooProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ???
_ FooProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
tests/providers/integration_tests.py:126:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/hostingde.py:36: in authenticate
src/lexicon/_private/providers/hostingde.py:50: in _get_zone_config
src/lexicon/_private/providers/hostingde.py:215: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ FooProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
tests/providers/integration_tests.py:133:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/hostingde.py:36: in authenticate
src/lexicon/_private/providers/hostingde.py:50: in _get_zone_config
src/lexicon/_private/providers/hostingde.py:215: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ FooProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
tests/providers/integration_tests.py:156:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/hostingde.py:36: in authenticate
src/lexicon/_private/providers/hostingde.py:50: in _get_zone_config
src/lexicon/_private/providers/hostingde.py:215: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ FooProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _
tests/providers/integration_tests.py:147:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/hostingde.py:36: in authenticate
src/lexicon/_private/providers/hostingde.py:50: in _get_zone_config
src/lexicon/_private/providers/hostingde.py:215: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ FooProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _
tests/providers/integration_tests.py:140:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/hostingde.py:36: in authenticate
src/lexicon/_private/providers/hostingde.py:50: in _get_zone_config
src/lexicon/_private/providers/hostingde.py:215: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ FooProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _
tests/providers/integration_tests.py:489:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/hostingde.py:36: in authenticate
src/lexicon/_private/providers/hostingde.py:50: in _get_zone_config
src/lexicon/_private/providers/hostingde.py:215: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ FooProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _
tests/providers/integration_tests.py:475:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/hostingde.py:36: in authenticate
src/lexicon/_private/providers/hostingde.py:50: in _get_zone_config
src/lexicon/_private/providers/hostingde.py:215: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ FooProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _
tests/providers/integration_tests.py:303:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/hostingde.py:36: in authenticate
src/lexicon/_private/providers/hostingde.py:50: in _get_zone_config
src/lexicon/_private/providers/hostingde.py:215: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ FooProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _
self =
>   ???
tests/providers/integration_tests.py:327:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/hostingde.py:36: in authenticate ???
src/lexicon/_private/providers/hostingde.py:50: in _get_zone_config ???
src/lexicon/_private/providers/hostingde.py:215: in _request ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
>       response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ FooProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _
self = > ???
tests/providers/integration_tests.py:313:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/hostingde.py:36: in authenticate ???
src/lexicon/_private/providers/hostingde.py:50: in _get_zone_config ???
src/lexicon/_private/providers/hostingde.py:215: in _request ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
>       response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ FooProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _
self = > ???
tests/providers/integration_tests.py:294:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/hostingde.py:36: in authenticate ???
src/lexicon/_private/providers/hostingde.py:50: in _get_zone_config ???
src/lexicon/_private/providers/hostingde.py:215: in _request ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
>       response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ FooProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _
self = > ???
tests/providers/integration_tests.py:541:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/hostingde.py:36: in authenticate ???
src/lexicon/_private/providers/hostingde.py:50: in _get_zone_config ???
src/lexicon/_private/providers/hostingde.py:215: in _request ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
>       response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ FooProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _
self = > ???
tests/providers/integration_tests.py:521:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/hostingde.py:36: in authenticate ???
src/lexicon/_private/providers/hostingde.py:50: in _get_zone_config ???
src/lexicon/_private/providers/hostingde.py:215: in _request ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
>       response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
__ FooProviderTests.test_provider_when_calling_list_records_after_setting_ttl __
self = > ???
tests/providers/integration_tests.py:211:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/hostingde.py:36: in authenticate ???
src/lexicon/_private/providers/hostingde.py:50: in _get_zone_config ???
src/lexicon/_private/providers/hostingde.py:215: in _request ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
>       response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ FooProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _
self = > ???
tests/providers/integration_tests.py:507:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/hostingde.py:36: in authenticate ???
src/lexicon/_private/providers/hostingde.py:50: in _get_zone_config ???
src/lexicon/_private/providers/hostingde.py:215: in _request ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
>       response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ FooProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _
self = > ???
tests/providers/integration_tests.py:199:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/hostingde.py:36: in authenticate ???
src/lexicon/_private/providers/hostingde.py:50: in _get_zone_config ???
src/lexicon/_private/providers/hostingde.py:215: in _request ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
>       response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ FooProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _
self = > ???
_ FooProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _
self = > ???
tests/providers/integration_tests.py:185:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/hostingde.py:36: in authenticate ???
src/lexicon/_private/providers/hostingde.py:50: in _get_zone_config ???
src/lexicon/_private/providers/hostingde.py:215: in _request ???
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ FooProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _
self = > ???
tests/providers/integration_tests.py:501:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/hostingde.py:36: in authenticate ???
src/lexicon/_private/providers/hostingde.py:50: in _get_zone_config ???
src/lexicon/_private/providers/hostingde.py:215: in _request ???
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ FooProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _
self = > ???
tests/providers/integration_tests.py:173:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/hostingde.py:36: in authenticate ???
src/lexicon/_private/providers/hostingde.py:50: in _get_zone_config ???
src/lexicon/_private/providers/hostingde.py:215: in _request ???
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ FooProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _
self = > ???
tests/providers/integration_tests.py:166:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/hostingde.py:36: in authenticate ???
src/lexicon/_private/providers/hostingde.py:50: in _get_zone_config ???
src/lexicon/_private/providers/hostingde.py:215: in _request ???
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ FooProviderTests.test_provider_when_calling_update_record_should_modify_record _
self = > ???
tests/providers/integration_tests.py:240:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/hostingde.py:36: in authenticate ???
src/lexicon/_private/providers/hostingde.py:50: in _get_zone_config ???
src/lexicon/_private/providers/hostingde.py:215: in _request ???
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ FooProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _
self = > ???
tests/providers/integration_tests.py:251:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/hostingde.py:36: in authenticate ???
src/lexicon/_private/providers/hostingde.py:50: in _get_zone_config ???
src/lexicon/_private/providers/hostingde.py:215: in _request ???
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ FooProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _
self = > ???
tests/providers/integration_tests.py:275:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/hostingde.py:36: in authenticate ???
src/lexicon/_private/providers/hostingde.py:50: in _get_zone_config ???
src/lexicon/_private/providers/hostingde.py:215: in _request ???
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
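Because every one of these failures is the same cassette-replay incompatibility rather than a provider regression, another option is to skip the affected test classes at collection time and keep the rest of the suite meaningful. This is a sketch only: the class names are taken from this log, and the use of a conftest.py hook is an assumption, not what the package actually does.

    # conftest.py -- hypothetical collection-time skip for the replay failures above.
    import pytest

    AFFECTED = ("FooProviderTests", "HoverProviderTests")  # class names seen in this log

    def pytest_collection_modifyitems(config, items):
        skip = pytest.mark.skip(reason="VCRHTTPResponse lacks version_string")
        for item in items:
            # item.nodeid looks like "tests/providers/test_x.py::Class::test_name"
            if any(name in item.nodeid for name in AFFECTED):
                item.add_marker(skip)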
_ FooProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _
self = > ???
tests/providers/integration_tests.py:259:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/hostingde.py:36: in authenticate ???
src/lexicon/_private/providers/hostingde.py:50: in _get_zone_config ???
src/lexicon/_private/providers/hostingde.py:215: in _request ???
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

________________ HoverProviderTests.test_provider_authenticate _________________
self = > ???
________________ HoverProviderTests.test_provider_authenticate _________________

self = > ???

tests/providers/integration_tests.py:106:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/hover.py:50: in authenticate
    ???
/usr/lib/python3.13/site-packages/requests/api.py:73: in get
    return request("get", url, params=params, **kwargs)
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = conn = method = 'GET', url = '/signin', body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn = preload_content = False, decode_content = False, enforce_content_length = True

>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ HoverProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

self = > ???

tests/providers/integration_tests.py:126:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/hover.py:50: in authenticate
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ HoverProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

self = > ???

tests/providers/integration_tests.py:133:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/hover.py:50: in authenticate
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ HoverProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

self = > ???

tests/providers/integration_tests.py:156:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/hover.py:50: in authenticate
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ HoverProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

self = > ???

tests/providers/integration_tests.py:147:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/hover.py:50: in authenticate
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ HoverProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _

self = > ???

tests/providers/integration_tests.py:140:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/hover.py:50: in authenticate
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ HoverProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _

self = > ???

tests/providers/integration_tests.py:489:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/hover.py:50: in authenticate
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ HoverProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _

self = > ???

tests/providers/integration_tests.py:475:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/hover.py:50: in authenticate
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
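Note: all of these HoverProviderTests failures happen inside authenticate(), before any Hover-specific logic runs, so they say nothing about the provider itself, only about the vcrpy/urllib3 mismatch. If the immediate goal is just to get check() through on riscv64 while that mismatch stands, one possible stopgap (an assumption about how the suite is invoked, not something this log shows) is to skip the affected class at collection time, for example:

# Hypothetical conftest.py addition: skip the cassette-replay Hover tests until
# the vcrpy stub gains version_string. The class name is taken from the tracebacks.
import pytest


def pytest_collection_modifyitems(config, items):
    skip_hover = pytest.mark.skip(reason="vcrpy replay stub lacks version_string (urllib3 >= 2.3)")
    for item in items:
        if "HoverProviderTests" in item.nodeid:
            item.add_marker(skip_hover)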
The same AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' is raised from src/lexicon/_private/providers/hover.py:50 (authenticate) via tests/providers/integration_tests.py:418 (_construct_authenticated_provider) in each of the following tests; only the calling line in tests/providers/integration_tests.py differs, and the requests/urllib3 frames match the traceback above.

_ HoverProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ (integration_tests.py:303)
_ HoverProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ (integration_tests.py:327)
_ HoverProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ (integration_tests.py:313)
_ HoverProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ (integration_tests.py:294)
_ HoverProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ (integration_tests.py:541)
_ HoverProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ (integration_tests.py:521)
_ HoverProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ (integration_tests.py:211)
_ HoverProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ (integration_tests.py:507)
_ HoverProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ (integration_tests.py:199)

_ HoverProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/hover.py:50: in authenticate ???
/usr/lib/python3.13/site-packages/requests/api.py:73: in get return request("get", url, params=params, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/signin', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ HoverProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ self = > ??? 
tests/providers/integration_tests.py:501: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/hover.py:50: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:73: in get return request("get", url, params=params, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/signin', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. 
:param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ HoverProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/hover.py:50: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:73: in get return request("get", url, params=params, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/signin', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. 
Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ HoverProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/hover.py:50: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:73: in get return request("get", url, params=params, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/signin', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ HoverProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/hover.py:50: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:73: in get return request("get", url, params=params, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/signin', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ HoverProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/hover.py:50: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:73: in get return request("get", url, params=params, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/signin', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) 
:param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ HoverProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/hover.py:50: in authenticate ??? 
/usr/lib/python3.13/site-packages/requests/api.py:73: in get return request("get", url, params=params, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/signin', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ HoverProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? 
tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/hover.py:50: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:73: in get return request("get", url, params=params, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/signin', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. 
:param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
        response.retries = retries
        response._connection = response_conn  # type: ignore[attr-defined]
        response._pool = self  # type: ignore[attr-defined]

        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_______________ InfobloxProviderTests.test_provider_authenticate _______________

self = >
???
tests/providers/integration_tests.py:106: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/infoblox.py:99: in authenticate
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = conn = method = 'GET', url = '/wapi/v2.6.1/zone_auth?fqdn=test.local&view=internal'
body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic aW5mb2Jsb3g6ZGVmYXVsdA=='}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn = preload_content = False, decode_content = False, enforce_content_length = True

        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InfobloxProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

tests/providers/integration_tests.py:126:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/infoblox.py:99: in authenticate
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InfobloxProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

tests/providers/integration_tests.py:133:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/infoblox.py:99: in authenticate
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InfobloxProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

tests/providers/integration_tests.py:156:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/infoblox.py:99: in authenticate
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InfobloxProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

tests/providers/integration_tests.py:147:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/infoblox.py:99: in authenticate
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InfobloxProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _

tests/providers/integration_tests.py:140:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/infoblox.py:99: in authenticate
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InfobloxProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _

tests/providers/integration_tests.py:489:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/infoblox.py:99: in authenticate
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InfobloxProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _

tests/providers/integration_tests.py:475:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/infoblox.py:99: in authenticate
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InfobloxProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _

self = >
???
tests/providers/integration_tests.py:303: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/infoblox.py:99: in authenticate
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = conn = method = 'GET', url = '/wapi/v2.6.1/zone_auth?fqdn=test.local&view=internal'
body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic aW5mb2Jsb3g6ZGVmYXVsdA=='}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn = preload_content = False, decode_content = False, enforce_content_length = True

    def _make_request(
        self,
        conn: BaseHTTPConnection,
        method: str,
        url: str,
        body: _TYPE_BODY | None = None,
        headers: typing.Mapping[str, str] | None = None,
        retries: Retry | None = None,
        timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
        chunked: bool = False,
        response_conn: BaseHTTPConnection | None = None,
        preload_content: bool = True,
        decode_content: bool = True,
        enforce_content_length: bool = True,
    ) -> BaseHTTPResponse:
        """
        Perform a request on a given urllib connection object taken from our pool.

        :param conn:
            a connection from one of our connection pools
        :param method:
            HTTP request method (such as GET, POST, PUT, etc.)
        :param url:
            The URL to perform the request on.
        :param body:
            Data to send in the request body, either :class:`str`, :class:`bytes`,
            an iterable of :class:`str`/:class:`bytes`, or a file-like object.
        :param headers:
            Dictionary of custom headers to send, such as User-Agent,
            If-None-Match, etc. If None, pool headers are used. If provided,
            these headers completely replace any pool-specific headers.
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ InfobloxProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/infoblox.py:99: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/wapi/v2.6.1/zone_auth?fqdn=test.local&view=internal' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic aW5mb2Jsb3g6ZGVmYXVsdA=='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) 
:param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ InfobloxProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/infoblox.py:99: in authenticate ??? 
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/wapi/v2.6.1/zone_auth?fqdn=test.local&view=internal' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic aW5mb2Jsb3g6ZGVmYXVsdA=='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ InfobloxProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? 
tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/infoblox.py:99: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/wapi/v2.6.1/zone_auth?fqdn=test.local&view=internal' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic aW5mb2Jsb3g6ZGVmYXVsdA=='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. 
:param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ InfobloxProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/infoblox.py:99: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/wapi/v2.6.1/zone_auth?fqdn=test.local&view=internal' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic aW5mb2Jsb3g6ZGVmYXVsdA=='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. 
Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ InfobloxProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/infoblox.py:99: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/wapi/v2.6.1/zone_auth?fqdn=test.local&view=internal' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic aW5mb2Jsb3g6ZGVmYXVsdA=='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ InfobloxProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ self = > ??? tests/providers/integration_tests.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/infoblox.py:99: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/wapi/v2.6.1/zone_auth?fqdn=test.local&view=internal' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic aW5mb2Jsb3g6ZGVmYXVsdA=='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
            if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET:
                raise

        # Reset the timeout for the recv() on the socket
        read_timeout = timeout_obj.read_timeout

        if not conn.is_closed:
            # In Python 3 socket.py will catch EAGAIN and return None when you
            # try and read into the file pointer created by http.client, which
            # instead raises a BadStatusLine exception. Instead of catching
            # the exception and assuming all BadStatusLine exceptions are read
            # timeouts, check for a zero timeout before making the request.
            if read_timeout == 0:
                raise ReadTimeoutError(
                    self, url, f"Read timed out. (read timeout={read_timeout})"
                )
            conn.timeout = read_timeout

        # Receive the response from the server
        try:
            response = conn.getresponse()
        except (BaseSSLError, OSError) as e:
            self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
            raise

        # Set properties that are used by the pooling layer.
        response.retries = retries
        response._connection = response_conn  # type: ignore[attr-defined]
        response._pool = self  # type: ignore[attr-defined]

        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InfobloxProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _

???
tests/providers/integration_tests.py:507:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/infoblox.py:99: in authenticate
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

method = 'GET', url = '/wapi/v2.6.1/zone_auth?fqdn=test.local&view=internal'
body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic aW5mb2Jsb3g6ZGVmYXVsdA=='}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
preload_content = False, decode_content = False, enforce_content_length = True

>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
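Every failure in this section is the same crash, hit while vcrpy replays a recorded cassette during provider authentication: the frame above shows urllib3's _make_request reading response.version_string for its debug log line, and the VCRHTTPResponse object that vcrpy substitutes for the real http.client response does not define that attribute under the urllib3 shipped in this chroot. A quick check along these lines would confirm the gap (a sketch only, run by hand inside the build chroot; the vcr.stubs import path is an assumption about where the stub class lives):

    # Hypothetical check, not part of the build above: confirm the attribute
    # gap that every traceback in this section points at.
    import urllib3
    from vcr.stubs import VCRHTTPResponse  # the stub class named in the AttributeError

    print("urllib3", urllib3.__version__)
    # False means urllib3's debug logging in _make_request raises the
    # AttributeError above whenever a response is replayed from a cassette.
    print("version_string defined:", hasattr(VCRHTTPResponse, "version_string"))

The remaining Infoblox provider tests fail identically; only the test name and the originating line in integration_tests.py change.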
_ InfobloxProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _

???
tests/providers/integration_tests.py:199:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/infoblox.py:99: in authenticate
    ???
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InfobloxProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _

???
tests/providers/integration_tests.py:185:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/infoblox.py:99: in authenticate
    ???
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InfobloxProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _

???
tests/providers/integration_tests.py:501:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/infoblox.py:99: in authenticate
    ???
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InfobloxProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _

???
tests/providers/integration_tests.py:173:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/infoblox.py:99: in authenticate
    ???
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InfobloxProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _

???
tests/providers/integration_tests.py:166:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/infoblox.py:99: in authenticate
    ???
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InfobloxProviderTests.test_provider_when_calling_update_record_should_modify_record _

???
tests/providers/integration_tests.py:240:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/infoblox.py:99: in authenticate
    ???
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InfobloxProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _

???
tests/providers/integration_tests.py:251:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/infoblox.py:99: in authenticate
    ???
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InfobloxProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _

???
tests/providers/integration_tests.py:275:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/infoblox.py:99: in authenticate
    ???
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InfobloxProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _

???
tests/providers/integration_tests.py:259:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/infoblox.py:99: in authenticate
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError ______________ InfomaniakProviderTests.test_provider_authenticate ______________ self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/infomaniak.py:47: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/infomaniak.py:202: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/1/product?service_name=domain&customer_name=testouille.at', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ InfomaniakProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/infomaniak.py:47: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/infomaniak.py:202: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/1/product?service_name=domain&customer_name=testouille.at', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ InfomaniakProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:133: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/infomaniak.py:47: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/infomaniak.py:202: in _request ??? 
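Every one of these failures is the same incompatibility: urllib3 2.3 started logging "response.version_string" inside connectionpool._make_request (connectionpool.py:551 above), and the VCRHTTPResponse stub that vcrpy substitutes when replaying recorded cassettes predates that attribute, so any provider test that replays a cassette dies with the same AttributeError before the provider logic is even reached. The proper fix is a vcrpy release that knows about urllib3 2.3; the snippet below is only a minimal local workaround sketch, assuming the missing attribute is the sole problem. The conftest.py placement and the "HTTP/1.1" default are illustrative assumptions, not part of the dns-lexicon sources.

    # conftest.py -- local workaround sketch, not part of the dns-lexicon package.
    # Give vcrpy's stub response the attribute that urllib3 >= 2.3 expects to log.
    try:
        from vcr.stubs import VCRHTTPResponse

        if not hasattr(VCRHTTPResponse, "version_string"):
            # urllib3 2.3 reads response.version_string in connectionpool._make_request;
            # a fixed default keeps that log.debug() call from raising AttributeError.
            VCRHTTPResponse.version_string = "HTTP/1.1"
    except ImportError:
        # vcrpy is not installed in this environment; nothing to patch.
        pass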
_ InfomaniakProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

self = >

    ???
tests/providers/integration_tests.py:133:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/infomaniak.py:47: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/infomaniak.py:202: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
method = 'GET', url = '/1/product?service_name=domain&customer_name=testouille.at', body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False

>               response.version_string,
E               AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InfomaniakProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

self = >

    ???
tests/providers/integration_tests.py:156:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/infomaniak.py:47: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/infomaniak.py:202: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
method = 'GET', url = '/1/product?service_name=domain&customer_name=testouille.at', body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False

>               response.version_string,
E               AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InfomaniakProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

self = >

    ???
tests/providers/integration_tests.py:147:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/infomaniak.py:47: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/infomaniak.py:202: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
method = 'GET', url = '/1/product?service_name=domain&customer_name=testouille.at', body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False

>               response.version_string,
E               AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InfomaniakProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _

self = >

    ???
tests/providers/integration_tests.py:140:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/infomaniak.py:47: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/infomaniak.py:202: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
method = 'GET', url = '/1/product?service_name=domain&customer_name=testouille.at', body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False

>               response.version_string,
E               AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InfomaniakProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _

self = >

    ???
tests/providers/integration_tests.py:489:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/infomaniak.py:47: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/infomaniak.py:202: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =  conn = 
method = 'GET'
url = '/1/product?service_name=domain&customer_name=testouille.at', body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =  preload_content = False, decode_content = False, enforce_content_length = True

    def _make_request(
        self,
        conn: BaseHTTPConnection,
        method: str,
        url: str,
        body: _TYPE_BODY | None = None,
        headers: typing.Mapping[str, str] | None = None,
        retries: Retry | None = None,
        timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
        chunked: bool = False,
        response_conn: BaseHTTPConnection | None = None,
        preload_content: bool = True,
        decode_content: bool = True,
        enforce_content_length: bool = True,
    ) -> BaseHTTPResponse:
        """
        Perform a request on a given urllib connection object taken from our pool.
_ InfomaniakProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _
tests/providers/integration_tests.py:489:
[traceback identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ InfomaniakProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _
tests/providers/integration_tests.py:475:
[traceback identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ InfomaniakProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _
tests/providers/integration_tests.py:303:
[traceback identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ InfomaniakProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _
tests/providers/integration_tests.py:327:
[traceback identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ InfomaniakProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _
tests/providers/integration_tests.py:313:
[traceback identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ InfomaniakProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _
tests/providers/integration_tests.py:294:
[traceback identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ InfomaniakProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _
tests/providers/integration_tests.py:541:
[traceback identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
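All of these tests fail in the same place: _construct_authenticated_provider calls the provider's authenticate() (src/lexicon/_private/providers/infomaniak.py:47), which issues a single GET to Infomaniak's /1/product endpoint before any record operation runs, and it is that replayed request whose response triggers the AttributeError. Reduced to plain requests, the call visible in the captured locals of the first traceback is roughly the following; the helper name and the api.infomaniak.com base URL are assumptions for illustration, not taken from this log.

# Hypothetical illustration of the replayed authenticate() request,
# not lexicon's actual implementation.
import requests

def find_domain_product(token: str, domain: str):
    # Same query string and Authorization header as in the captured request.
    resp = requests.get(
        "https://api.infomaniak.com/1/product",  # base URL is an assumption
        params={"service_name": "domain", "customer_name": domain},
        headers={"Authorization": f"Bearer {token}"},
    )
    resp.raise_for_status()
    return resp.json()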
_ InfomaniakProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _
tests/providers/integration_tests.py:521:
[traceback identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ InfomaniakProviderTests.test_provider_when_calling_list_records_after_setting_ttl _
tests/providers/integration_tests.py:211:
[traceback identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ InfomaniakProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _
tests/providers/integration_tests.py:507:
[identical urllib3 traceback continues]
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ InfomaniakProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/infomaniak.py:47: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/infomaniak.py:202: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/1/product?service_name=domain&customer_name=testouille.at', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ InfomaniakProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/infomaniak.py:47: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/infomaniak.py:202: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/1/product?service_name=domain&customer_name=testouille.at', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ InfomaniakProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ self = > ??? 
tests/providers/integration_tests.py:501: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/infomaniak.py:47: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/infomaniak.py:202: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/1/product?service_name=domain&customer_name=testouille.at', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ InfomaniakProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/infomaniak.py:47: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/infomaniak.py:202: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/1/product?service_name=domain&customer_name=testouille.at', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ InfomaniakProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/infomaniak.py:47: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/infomaniak.py:202: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/1/product?service_name=domain&customer_name=testouille.at', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ InfomaniakProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/infomaniak.py:47: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/infomaniak.py:202: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/1/product?service_name=domain&customer_name=testouille.at', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ InfomaniakProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/infomaniak.py:47: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/infomaniak.py:202: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/1/product?service_name=domain&customer_name=testouille.at', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ InfomaniakProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? 
tests/providers/integration_tests.py:275:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/infomaniak.py:47: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/infomaniak.py:202: in _request ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
url = '/1/product?service_name=domain&customer_name=testouille.at'
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InfomaniakProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _
self = > ???
tests/providers/integration_tests.py:259:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/infomaniak.py:47: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/infomaniak.py:202: in _request ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
url = '/1/product?service_name=domain&customer_name=testouille.at'
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
______________ InternetbsProviderTests.test_provider_authenticate ______________
self = > ???
tests/providers/integration_tests.py:106:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/internetbs.py:50: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/internetbs.py:204: in _request ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
url = '/Domain/List?searchTermFilter=world-of-ysera.com&ResponseFormat=json&ApiKey=placeholder_auth_key&Password=placeholder_auth_password' body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'}
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InternetbsProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
self = > ???
tests/providers/integration_tests.py:126:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/internetbs.py:50: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/internetbs.py:204: in _request ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
url = '/Domain/List?searchTermFilter=world-of-ysera.com&ResponseFormat=json&ApiKey=placeholder_auth_key&Password=placeholder_auth_password' body = '{}'
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
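Every provider-test failure above is the same crash: at connectionpool.py:551 urllib3 logs response.version_string, but the recorded-response stub VCRHTTPResponse that stands in for a real urllib3 response during cassette replay does not define that attribute, so each test dies inside urllib3's debug logging before any provider logic runs. Below is a minimal local workaround sketch, under the assumptions that VCRHTTPResponse comes from vcrpy's vcr.stubs module and that urllib3 only reads the attribute for this debug log line; it is not part of the dns-lexicon sources.

    # conftest.py -- hedged workaround sketch, assumptions noted above.
    import vcr.stubs

    if not hasattr(vcr.stubs.VCRHTTPResponse, "version_string"):
        # urllib3 >= 2.3 reads response.version_string when logging a request;
        # a plain string is enough to satisfy that attribute access on replay.
        vcr.stubs.VCRHTTPResponse.version_string = "HTTP/1.1"

The more durable fix is presumably a python-vcrpy release that knows about urllib3's version_string, or temporarily constraining urllib3 in the build chroot; the snippet above only patches the replay stub for the duration of the test run.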
_ InternetbsProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
self = > ???
tests/providers/integration_tests.py:133:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/internetbs.py:50: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/internetbs.py:204: in _request ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
url = '/Domain/List?searchTermFilter=world-of-ysera.com&ResponseFormat=json&ApiKey=placeholder_auth_key&Password=placeholder_auth_password' body = '{}'
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InternetbsProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
self = > ???
tests/providers/integration_tests.py:156:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/internetbs.py:50: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/internetbs.py:204: in _request ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
url = '/Domain/List?searchTermFilter=world-of-ysera.com&ResponseFormat=json&ApiKey=placeholder_auth_key&Password=placeholder_auth_password' body = '{}'
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InternetbsProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _
self = > ???
tests/providers/integration_tests.py:147:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/internetbs.py:50: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/internetbs.py:204: in _request ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
url = '/Domain/List?searchTermFilter=world-of-ysera.com&ResponseFormat=json&ApiKey=placeholder_auth_key&Password=placeholder_auth_password' body = '{}'
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InternetbsProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _
self = > ???
tests/providers/integration_tests.py:140:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/internetbs.py:50: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/internetbs.py:204: in _request ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
url = '/Domain/List?searchTermFilter=world-of-ysera.com&ResponseFormat=json&ApiKey=placeholder_auth_key&Password=placeholder_auth_password' body = '{}'
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InternetbsProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _
self = > ???
The following InternetbsProviderTests cases fail identically: each aborts while authenticating the provider (tests/providers/integration_tests.py:418: in _construct_authenticated_provider -> src/lexicon/_private/providers/internetbs.py:50: in authenticate -> src/lexicon/interfaces.py:163: in _get -> src/lexicon/_private/providers/internetbs.py:204: in _request -> requests -> urllib3) and raises the same error at /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551:
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

_ InternetbsProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _ (tests/providers/integration_tests.py:489)
_ InternetbsProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ (tests/providers/integration_tests.py:475)
_ InternetbsProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ (tests/providers/integration_tests.py:303)
_ InternetbsProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ (tests/providers/integration_tests.py:327)
_ InternetbsProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ (tests/providers/integration_tests.py:313)
_ InternetbsProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ (tests/providers/integration_tests.py:294)
_ InternetbsProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ (tests/providers/integration_tests.py:541)
_ InternetbsProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ (tests/providers/integration_tests.py:521)

_ InternetbsProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ self = > ???
_ InternetbsProviderTests.test_provider_when_calling_list_records_after_setting_ttl _

self = > ???

tests/providers/integration_tests.py:211:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/internetbs.py:50: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/internetbs.py:204: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
    [locals and _make_request source identical to the dump above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InternetbsProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _

self = > ???

tests/providers/integration_tests.py:507:
    [remaining frames and _make_request dump identical to the first failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InternetbsProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _

self = > ???

tests/providers/integration_tests.py:199:
    [remaining frames and _make_request dump identical to the first failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InternetbsProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _

self = > ???

tests/providers/integration_tests.py:185:
    [remaining frames and _make_request dump identical to the first failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InternetbsProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _

self = > ???

tests/providers/integration_tests.py:501:
    [remaining frames and _make_request dump identical to the first failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InternetbsProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _

self = > ???

tests/providers/integration_tests.py:173:
    [remaining frames and _make_request dump identical to the first failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InternetbsProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _

self = > ???

tests/providers/integration_tests.py:166:
    [remaining frames and _make_request dump identical to the first failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
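If shimming vcrpy is not desirable for a package build, the affected Internetbs cassette tests could instead be deselected until the incompatibility is fixed upstream. A minimal sketch, assuming a local conftest.py and matching on the class name that appears in these failures:

    # conftest.py -- hypothetical: skip the cassette-replay tests that hit the
    # missing version_string attribute instead of patching VCRHTTPResponse.
    import pytest

    def pytest_collection_modifyitems(config, items):
        skip = pytest.mark.skip(
            reason="vcrpy cassette replay lacks version_string for this urllib3"
        )
        for item in items:
            if "InternetbsProviderTests" in item.nodeid:
                item.add_marker(skip)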
_ InternetbsProviderTests.test_provider_when_calling_update_record_should_modify_record _

self = > ???

tests/providers/integration_tests.py:240:
    [remaining frames and _make_request dump identical to the first failure above]
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ InternetbsProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _

self = > ???

tests/providers/integration_tests.py:251:
    [frames and _make_request dump as above; the traceback continues identically]
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ InternetbsProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/internetbs.py:50: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/internetbs.py:204: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/Domain/List?searchTermFilter=world-of-ysera.com&ResponseFormat=json&ApiKey=placeholder_auth_key&Password=placeholder_auth_password' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ InternetbsProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/internetbs.py:50: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/internetbs.py:204: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/Domain/List?searchTermFilter=world-of-ysera.com&ResponseFormat=json&ApiKey=placeholder_auth_key&Password=placeholder_auth_password' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError ________________ JokerProviderTests.test_provider_authenticate _________________ self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/joker.py:67: in authenticate ??? 
/usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/request/login', body = 'api-key=placeholder_auth_token' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '30', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ JokerProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? 
tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/joker.py:67: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/request/login', body = 'api-key=placeholder_auth_token' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '30', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. 
Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ JokerProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:133: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/joker.py:67: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/request/login', body = 'api-key=placeholder_auth_token' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '30', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. 
Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ JokerProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = > ??? tests/providers/integration_tests.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/joker.py:67: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/request/login', body = 'api-key=placeholder_auth_token' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '30', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ JokerProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = > ??? tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/joker.py:67: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/request/login', body = 'api-key=placeholder_auth_token' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '30', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
_ JokerProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _
tests/providers/integration_tests.py:140:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/joker.py:67: in authenticate
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ JokerProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _
tests/providers/integration_tests.py:489:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/joker.py:67: in authenticate
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ JokerProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _
tests/providers/integration_tests.py:475:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/joker.py:67: in authenticate
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ JokerProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _
tests/providers/integration_tests.py:303:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/joker.py:67: in authenticate
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
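Because version_string is consulted only to build that debug log line, one hypothetical stopgap until a fixed vcrpy is packaged is to give the replay stub a constant value from a local conftest.py. The vcr.stubs import path and the fixed "HTTP/1.1" value are assumptions of this sketch, not upstream code:

    # conftest.py sketch -- a local workaround, not part of dns-lexicon or vcrpy.
    # urllib3 >= 2.3 reads `version_string` only for its request log line, so a
    # constant is good enough for responses replayed from cassettes.
    try:
        from vcr.stubs import VCRHTTPResponse
    except ImportError:  # vcrpy missing or its module layout changed
        VCRHTTPResponse = None

    if VCRHTTPResponse is not None and not hasattr(VCRHTTPResponse, "version_string"):
        VCRHTTPResponse.version_string = "HTTP/1.1"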
_ JokerProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _
tests/providers/integration_tests.py:327:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/joker.py:67: in authenticate
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ JokerProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _
tests/providers/integration_tests.py:313:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/joker.py:67: in authenticate
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ JokerProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _
tests/providers/integration_tests.py:294:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/joker.py:67: in authenticate
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ JokerProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _
tests/providers/integration_tests.py:541:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/joker.py:67: in authenticate
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
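All of the JokerProviderTests cases above fail in the same _construct_authenticated_provider -> authenticate path, before any provider-specific logic runs, so the provider code itself is never exercised. If the check() run has to pass before a fixed vcrpy lands, a narrower alternative to patching the stub would be to skip just the cassette-backed tests. This collection hook is a hypothetical sketch; the marker reason and the JokerProviderTests filter are illustrative, not upstream conventions:

    # conftest.py sketch (hypothetical): skip the cassette-replay provider tests
    # instead of letting the urllib3/vcrpy mismatch fail the whole suite.
    import pytest

    def pytest_collection_modifyitems(config, items):
        skip_vcr = pytest.mark.skip(
            reason="vcrpy response stub lacks version_string (urllib3 >= 2.3)"
        )
        for item in items:
            if "JokerProviderTests" in item.nodeid:
                item.add_marker(skip_vcr)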
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ JokerProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/joker.py:67: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/request/login', body = 'api-key=placeholder_auth_token' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '30', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ JokerProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ self = > ??? tests/providers/integration_tests.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/joker.py:67: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/request/login', body = 'api-key=placeholder_auth_token' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '30', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ JokerProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? tests/providers/integration_tests.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/joker.py:67: in authenticate ??? 
/usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/request/login', body = 'api-key=placeholder_auth_token' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '30', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ JokerProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? 
tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/joker.py:67: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/request/login', body = 'api-key=placeholder_auth_token' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '30', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. 
Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ JokerProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/joker.py:67: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/request/login', body = 'api-key=placeholder_auth_token' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '30', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. 
Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ JokerProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ self = > ??? tests/providers/integration_tests.py:501: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/joker.py:67: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/request/login', body = 'api-key=placeholder_auth_token' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '30', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ JokerProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/joker.py:67: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/request/login', body = 'api-key=placeholder_auth_token' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '30', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ JokerProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/joker.py:67: in authenticate ??? 
/usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/request/login', body = 'api-key=placeholder_auth_token' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '30', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ JokerProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? 
_ JokerProviderTests.test_provider_when_calling_update_record_should_modify_record _

tests/providers/integration_tests.py:240:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/joker.py:67: in authenticate
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ JokerProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _

tests/providers/integration_tests.py:251:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/joker.py:67: in authenticate
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ JokerProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _

tests/providers/integration_tests.py:275:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/joker.py:67: in authenticate
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ JokerProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _

tests/providers/integration_tests.py:259:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/joker.py:67: in authenticate
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

________________ LinodeProviderTests.test_provider_authenticate ________________

self = > ???
tests/providers/integration_tests.py:106:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/linode.py:34: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/linode.py:154: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =  conn =  method = 'GET'
url = '/api/?api_key=placeholder_auth_token&resultFormat=JSON&api_action=domain.list'
body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =  preload_content = False, decode_content = False, enforce_content_length = True
        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
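The Linode failures have the same cause; only the replayed request differs (a GET against api_action=domain.list instead of Joker's POST to /request/login). The mismatch can be confirmed without running the suite at all, for example with a throwaway check against the chroot's site-packages (hypothetical snippet, not part of the package):

    # Compare urllib3's own response class with the stub that vcrpy swaps in
    # during cassette playback.
    from urllib3.response import HTTPResponse
    from vcr.stubs import VCRHTTPResponse

    # Expected with urllib3 >= 2.3: True; with the vcrpy in this chroot: False.
    print("urllib3 HTTPResponse:", hasattr(HTTPResponse, "version_string"))
    print("vcr VCRHTTPResponse: ", hasattr(VCRHTTPResponse, "version_string"))

If patching is not wanted, the affected provider tests could instead be deselected for the check() step, for example with pytest's -k expression, until a vcrpy with version_string support reaches the repositories.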
_ LinodeProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

tests/providers/integration_tests.py:126:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/linode.py:34: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/linode.py:154: in _request
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ LinodeProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

tests/providers/integration_tests.py:133:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/linode.py:34: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/linode.py:154: in _request
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ LinodeProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

tests/providers/integration_tests.py:156:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/linode.py:34: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/linode.py:154: in _request
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ LinodeProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

self = > ???
tests/providers/integration_tests.py:147:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/linode.py:34: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/linode.py:154: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/api/?api_key=placeholder_auth_token&resultFormat=JSON&api_action=domain.list' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ LinodeProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? 
tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/linode.py:34: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/linode.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/api/?api_key=placeholder_auth_token&resultFormat=JSON&api_action=domain.list' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ LinodeProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _ self = > ??? tests/providers/integration_tests.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/linode.py:34: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/linode.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/api/?api_key=placeholder_auth_token&resultFormat=JSON&api_action=domain.list' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ LinodeProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ self = > ??? tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/linode.py:34: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/linode.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/api/?api_key=placeholder_auth_token&resultFormat=JSON&api_action=domain.list' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ LinodeProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/linode.py:34: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/linode.py:154: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/api/?api_key=placeholder_auth_token&resultFormat=JSON&api_action=domain.list' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ LinodeProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? 
tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/linode.py:34: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/linode.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/api/?api_key=placeholder_auth_token&resultFormat=JSON&api_action=domain.list' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ LinodeProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/linode.py:34: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/linode.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/api/?api_key=placeholder_auth_token&resultFormat=JSON&api_action=domain.list' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ LinodeProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/linode.py:34: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/linode.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/api/?api_key=placeholder_auth_token&resultFormat=JSON&api_action=domain.list' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
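A note on the failure mode shared by all of the tests above: the urllib3 in the build chroot logs response.version_string for every completed request, but the VCRHTTPResponse objects that vcrpy substitutes when replaying the recorded cassettes do not define that attribute, so the debug log call raises before any provider logic runs. A minimal local shim, assuming vcrpy exposes VCRHTTPResponse from vcr.stubs and that the cassettes were recorded over HTTP/1.1 (newer vcrpy releases are expected to carry an equivalent fix upstream), might look like this in the test suite's conftest.py:

    # conftest.py -- hypothetical workaround, not part of the packaged sources
    from vcr.stubs import VCRHTTPResponse

    # urllib3's connection pool reads response.version_string when logging each
    # request; give replayed cassette responses a sensible default so that the
    # debug log call no longer raises AttributeError.
    if not hasattr(VCRHTTPResponse, "version_string"):
        VCRHTTPResponse.version_string = "HTTP/1.1"  # assumption: cassettes use HTTP/1.1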
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ LinodeProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/linode.py:34: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/linode.py:154: in _request ??? 
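Every one of these Linode failures is the same incompatibility: the urllib3 in the build chroot logs response.version_string after receiving a response (the log.debug call in the excerpt above), but the VCRHTTPResponse object that python-vcrpy substitutes while replaying the test cassettes does not define that attribute, so authentication fails before any provider logic runs. Below is a minimal sketch of a local workaround, assuming a conftest.py that the pytest run picks up and assuming that moving to a python-vcrpy release compatible with this urllib3 is not yet an option; the attribute value and the patch location are illustrative assumptions, not upstream dns-lexicon or vcrpy code.

    # conftest.py -- hypothetical local shim, not part of dns-lexicon upstream.
    # urllib3 reads response.version_string when logging a finished request,
    # but vcrpy's VCRHTTPResponse playback stub predates that attribute.
    # Give the stub a class-level fallback so cassette-backed responses
    # survive the log.debug call in urllib3's _make_request.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # The cassettes do not record a protocol version; HTTP/1.1 is assumed.
        VCRHTTPResponse.version_string = "HTTP/1.1"

A python-vcrpy release that knows about this attribute, or constraining urllib3 in the chroot to a version that does not log it, would address the same failure without touching the test suite; the shim above only affects the pytest run, not the packaged lexicon module.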
_ LinodeProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _

tests/providers/integration_tests.py:541:
    [test frames, request locals, and urllib3 _make_request source identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ LinodeProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _

tests/providers/integration_tests.py:521:
    [test frames, request locals, and urllib3 _make_request source identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ LinodeProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _

tests/providers/integration_tests.py:507:
    [test frames, request locals, and urllib3 _make_request source identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ LinodeProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _

tests/providers/integration_tests.py:199:
    [test frames, request locals, and urllib3 _make_request source identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ LinodeProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _

tests/providers/integration_tests.py:185:
    [test frames, request locals, and urllib3 _make_request source identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ LinodeProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _

tests/providers/integration_tests.py:501:
    [test frames, request locals, and urllib3 _make_request source identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ LinodeProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _

tests/providers/integration_tests.py:173:
    [test frames, request locals, and urllib3 _make_request source identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ LinodeProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _

tests/providers/integration_tests.py:166:
    [test frames, request locals, and urllib3 _make_request source identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ LinodeProviderTests.test_provider_when_calling_update_record_should_modify_record _

    ???
tests/providers/integration_tests.py:240:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/linode.py:34: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/linode.py:154: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/api/?api_key=placeholder_auth_token&resultFormat=JSON&api_action=domain.list' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ LinodeProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? 
tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/linode.py:34: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/linode.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/api/?api_key=placeholder_auth_token&resultFormat=JSON&api_action=domain.list' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ LinodeProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/linode.py:34: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/linode.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/api/?api_key=placeholder_auth_token&resultFormat=JSON&api_action=domain.list' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ LinodeProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/linode.py:34: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/linode.py:154: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/api/?api_key=placeholder_auth_token&resultFormat=JSON&api_action=domain.list' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _______________ Linode4ProviderTests.test_provider_authenticate ________________ self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/linode4.py:34: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/linode4.py:164: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection... 'Authorization': 'Bearer placeholder_auth_token', 'X-Filter': '{"domain": "lexicon-test.com"}', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
_ Linode4ProviderTests.test_provider_authenticate _
tests/providers/integration_tests.py:106: via _construct_authenticated_provider -> src/lexicon/_private/providers/linode4.py:34: in authenticate (GET /v4/domains)
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ Linode4ProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
tests/providers/integration_tests.py:126: via _construct_authenticated_provider -> src/lexicon/_private/providers/linode4.py:34: in authenticate
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ Linode4ProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
tests/providers/integration_tests.py:133: via _construct_authenticated_provider -> src/lexicon/_private/providers/linode4.py:34: in authenticate
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ Linode4ProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
self = > ???
tests/providers/integration_tests.py:156:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/linode4.py:34: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/linode4.py:164: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =  conn =  method = 'GET', url = '/v4/domains', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...
           'Authorization': 'Bearer placeholder_auth_token', 'X-Filter': '{"domain": "lexicon-test.com"}', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =  preload_content = False, decode_content = False, enforce_content_length = True

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ Linode4ProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _
self = >
???
tests/providers/integration_tests.py:147:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/linode4.py:34: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/linode4.py:164: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
>   response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ Linode4ProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _
self = >
???
tests/providers/integration_tests.py:140:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/linode4.py:34: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/linode4.py:164: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
>   response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ Linode4ProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _
self = >
???
tests/providers/integration_tests.py:489:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/linode4.py:34: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/linode4.py:164: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
>   response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
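Every failure above and below collapses to the same incompatibility: urllib3's HTTPConnectionPool._make_request() debug-logs response.version_string (connectionpool.py:551 in the traceback), but the VCRHTTPResponse stub that vcrpy substitutes while replaying the recorded cassettes does not define that attribute, so each Linode4 provider test aborts inside authenticate() before any assertion runs. A minimal sketch to confirm the mismatch in the build chroot follows; the vcr.stubs import path is an assumption, only the class and attribute names come from the traceback.

# check_vcr_urllib3.py -- illustrative sketch, not part of the package sources.
import urllib3
from vcr.stubs import VCRHTTPResponse  # assumed location of the stub class named in the error

print("urllib3:", urllib3.__version__)
# Expected to print False on vcrpy releases that predate urllib3's
# version_string attribute, which is exactly the AttributeError reported above.
print("stub has version_string:", hasattr(VCRHTTPResponse, "version_string"))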
_ Linode4ProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _
self = >
???
tests/providers/integration_tests.py:475:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/linode4.py:34: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/linode4.py:164: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
>   response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ Linode4ProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _
self = >
???
tests/providers/integration_tests.py:303:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/linode4.py:34: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/linode4.py:164: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
>   response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ Linode4ProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _
self = >
???
tests/providers/integration_tests.py:327:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/linode4.py:34: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/linode4.py:164: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
>   response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
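If the goal is only to let the remaining suite run until a fixed python-vcrpy is packaged, one possible stop-gap is to give the stub the attribute before urllib3 logs it, for example from the project's conftest.py. This is a hypothetical local workaround sketch, not the upstream fix; a plain string is enough because the failing call only interpolates the value into a log message.

# conftest.py -- hypothetical stop-gap for this build only, not part of dns-lexicon upstream.
try:
    from vcr.stubs import VCRHTTPResponse  # assumed module path for the stub class

    if not hasattr(VCRHTTPResponse, "version_string"):
        # urllib3's debug logging only formats this value, so a constant avoids the AttributeError.
        VCRHTTPResponse.version_string = "HTTP/1.1"
except ImportError:
    # vcrpy missing or already fixed: nothing to patch.
    pass

Pinning urllib3 below the release that introduced version_string, or updating python-vcrpy to a version that implements it, would make the shim unnecessary.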
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ Linode4ProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/linode4.py:34: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/linode4.py:164: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection... 'Authorization': 'Bearer placeholder_auth_token', 'X-Filter': '{"domain": "lexicon-test.com"}', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ Linode4ProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/linode4.py:34: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/linode4.py:164: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection... 
'Authorization': 'Bearer placeholder_auth_token', 'X-Filter': '{"domain": "lexicon-test.com"}', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. 
except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ Linode4ProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/linode4.py:34: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/linode4.py:164: in _request ??? 
_ Linode4ProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _

self = >

    ???

tests/providers/integration_tests.py:541:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/linode4.py:34: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/linode4.py:164: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = conn = method = 'GET', url = '/v4/domains', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...
 'Authorization': 'Bearer placeholder_auth_token', 'X-Filter': '{"domain": "lexicon-test.com"}', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn = preload_content = False, decode_content = False, enforce_content_length = True

    (urllib3 _make_request() source listing, identical in every one of these failures)

>           response.version_string,
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Linode4ProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _
tests/providers/integration_tests.py:521:
    (call chain, locals, and urllib3 _make_request() listing identical to the failure above)
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Linode4ProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _
tests/providers/integration_tests.py:507:
    (call chain, locals, and urllib3 _make_request() listing identical to the failure above)
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Linode4ProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _
tests/providers/integration_tests.py:199:
    (call chain, locals, and urllib3 _make_request() listing identical to the failure above)
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Linode4ProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _
tests/providers/integration_tests.py:185:
    (call chain, locals, and urllib3 _make_request() listing identical to the failure above)
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Linode4ProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _
tests/providers/integration_tests.py:501:
    (call chain, locals, and urllib3 _make_request() listing identical to the failure above)
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
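Since every failure above and below reduces to this one missing attribute, the suite could be unblocked with a small test-side shim while a proper fix lands. The conftest.py placement and the fixed HTTP/1.1 value are assumptions of this sketch, not something the lexicon test suite ships; upgrading python-vcrpy to a release whose stub defines version_string itself, or holding python-urllib3 back, would make the shim unnecessary.

    # conftest.py -- hedged workaround sketch, not part of the upstream test suite.
    # Assumption: replayed responses only need version_string so that urllib3's
    # debug logging in _make_request() succeeds, so a fixed value is sufficient.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # The recorded cassette exchanges are plain HTTP/1.1, so report that on replay.
        VCRHTTPResponse.version_string = "HTTP/1.1"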
_ Linode4ProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _
tests/providers/integration_tests.py:173:
    (call chain, locals, and urllib3 _make_request() listing identical to the failures above)
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Linode4ProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _
tests/providers/integration_tests.py:166:
    (call chain, locals, and urllib3 _make_request() listing identical to the failures above)
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Linode4ProviderTests.test_provider_when_calling_update_record_should_modify_record _

self = >

    ???

tests/providers/integration_tests.py:240:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/linode4.py:34: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/linode4.py:164: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = conn = method = 'GET', url = '/v4/domains', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...
 'Authorization': 'Bearer placeholder_auth_token', 'X-Filter': '{"domain": "lexicon-test.com"}', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn = preload_content = False, decode_content = False, enforce_content_length = True
def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used.
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ Linode4ProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/linode4.py:34: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/linode4.py:164: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection... 
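Each of these Linode4 failures is raised before the provider ever inspects a response: the captured locals show authenticate() issuing a plain requests GET against /v4/domains with a placeholder bearer token, an X-Filter header, and an empty JSON body ('{}'). A minimal sketch of that call follows; only the path, headers, and body come from the log, while the api.linode.com base URL is assumed here for illustration.

    import json
    import requests

    # Hypothetical reconstruction of the probe seen in the captured locals above.
    # Base URL is an assumption; '/v4/domains', the headers, and the '{}' body are from the log.
    response = requests.get(
        "https://api.linode.com/v4/domains",
        headers={
            "Authorization": "Bearer placeholder_auth_token",
            "X-Filter": json.dumps({"domain": "lexicon-test.com"}),
            "Accept": "application/json",
        },
        json={},  # serialized as '{}', hence the Content-Length: 2 seen in the locals
    )
    response.raise_for_status()

During the test run this request never reaches Linode; the cassette stub answers it, and the failure occurs while urllib3 logs the replayed response.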
_ Linode4ProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _

tests/providers/integration_tests.py:275:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/linode4.py:34: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/linode4.py:164: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
>                   response.version_string,
E               AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Linode4ProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _

tests/providers/integration_tests.py:259:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/linode4.py:34: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/linode4.py:164: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
>                   response.version_string,
E               AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
________________ LuaDNSProviderTests.test_provider_authenticate ________________

tests/providers/integration_tests.py:106:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/luadns.py:36: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/luadns.py:140: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

method = 'GET', url = '/v1/zones', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
preload_content = False, decode_content = False, enforce_content_length = True

>                   response.version_string,
E               AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
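The retries and timeout values in these captured locals are urllib3's own configuration objects; vcrpy replays the cassette underneath them, but the values shown (Retry(total=0, ...) and Timeout(connect=None, read=None, total=None)) are exactly what requests' HTTPAdapter hands to the connection pool. A standalone sketch of constructing and using them directly, unrelated to the lexicon code itself:

    import urllib3
    from urllib3.util.retry import Retry
    from urllib3.util.timeout import Timeout

    # The same configuration objects that appear in the captured locals above:
    # zero retries, and a timeout that defers entirely to the defaults.
    retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
    timeout = Timeout(connect=None, read=None, total=None)

    http = urllib3.PoolManager()
    # Passing them per-request mirrors what requests' HTTPAdapter does internally.
    resp = http.request("GET", "https://example.org/", retries=retries, timeout=timeout)
    print(resp.status, resp.headers.get("Content-Type"))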
_ LuaDNSProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

tests/providers/integration_tests.py:126:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/luadns.py:36: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/luadns.py:140: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
>                   response.version_string,
E               AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ LuaDNSProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

tests/providers/integration_tests.py:156:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/luadns.py:36: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/luadns.py:140: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
>                   response.version_string,
E               AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ LuaDNSProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

tests/providers/integration_tests.py:147:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/luadns.py:36: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/luadns.py:140: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
>                   response.version_string,
E               AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
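Every failure in this run ends on the same line: urllib3's connection pool logs response.version_string, an attribute the VCRHTTPResponse stub that replays the recorded cassettes does not define, so the AttributeError fires before any test body runs. Assuming the stub is vcrpy's (as the class name suggests) and that the attribute was introduced by a newer urllib3 than the one vcrpy was written against, a small conftest.py shim along these lines would be one possible local workaround for test runs; this is a hypothetical sketch, not part of the package or upstream:

    # conftest.py -- hypothetical local shim, assuming vcrpy's VCRHTTPResponse is the stub in use
    from vcr.stubs import VCRHTTPResponse

    # Newer urllib3 reads response.version_string when emitting its debug log line;
    # give the replayed stub a class-level default when the attribute is missing.
    if not hasattr(VCRHTTPResponse, "version_string"):
        VCRHTTPResponse.version_string = "HTTP/1.1"

An updated vcrpy that implements the attribute itself would make such a shim unnecessary.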
tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/luadns.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/luadns.py:140: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
            response.retries = retries
            response._connection = response_conn  # type: ignore[attr-defined]
            response._pool = self  # type: ignore[attr-defined]
            log.debug(
                '%s://%s:%s "%s %s %s" %s %s',
                self.scheme,
                self.host,
                self.port,
                method,
                url,
>               response.version_string,
                response.status,
                response.length_remaining,
            )
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
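Editor's note: every failure in this section is the same incompatibility. urllib3's _make_request (connectionpool.py:551) now logs response.version_string, but the VCRHTTPResponse stub that vcrpy substitutes while replaying the recorded cassettes does not define that attribute. Below is a minimal, hypothetical shim for the test suite's conftest.py; it is not part of dns-lexicon, and it assumes vcrpy's stub is importable as vcr.stubs.VCRHTTPResponse and may carry an http.client-style integer version.

# conftest.py -- hypothetical compatibility shim, not part of dns-lexicon.
# urllib3 >= 2.3 logs response.version_string when a request completes, but
# vcrpy's replay stub predates that attribute, producing the AttributeError
# above. Patch the stub only when it lacks the attribute.
from vcr.stubs import VCRHTTPResponse


def _version_string(self) -> str:
    # Map the http.client-style integer version (10/11/...) to the string
    # urllib3 expects; fall back to HTTP/1.1 if the cassette has no version.
    version = getattr(self, "version", 11)
    return {9: "HTTP/0.9", 10: "HTTP/1.0", 11: "HTTP/1.1", 20: "HTTP/2"}.get(
        version, "HTTP/1.1"
    )


if not hasattr(VCRHTTPResponse, "version_string"):
    VCRHTTPResponse.version_string = property(_version_string)  # type: ignore[attr-defined]

Once a vcrpy release supports urllib3 2.3 natively, the shim becomes a no-op and can be dropped. The remaining LuaDNSProviderTests failures are condensed below; each raised the identical error.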
[The following LuaDNSProviderTests cases fail identically; the repeated frame chain
(tests/providers/integration_tests.py:418 _construct_authenticated_provider ->
src/lexicon/_private/providers/luadns.py:36 authenticate -> src/lexicon/interfaces.py:163 _get ->
src/lexicon/_private/providers/luadns.py:140 _request -> requests -> urllib3 _make_request)
and the inlined urllib3 source are elided, keeping each test's entry point and final error.]

_ LuaDNSProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _
tests/providers/integration_tests.py:140:
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ LuaDNSProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _
tests/providers/integration_tests.py:489:
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ LuaDNSProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _
tests/providers/integration_tests.py:475:
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ LuaDNSProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _
tests/providers/integration_tests.py:303:
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ LuaDNSProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _
tests/providers/integration_tests.py:327:
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ LuaDNSProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _
tests/providers/integration_tests.py:313:
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ LuaDNSProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _
tests/providers/integration_tests.py:294:
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
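If patching the stub is undesirable, the cassette-replaying tests can instead be skipped until a fixed vcrpy is available. A hypothetical conftest.py hook is sketched below; the LuaDNSProviderTests class name comes from the failures above, and other cassette-based provider test classes in this suite may need the same treatment.

# conftest.py -- hypothetical alternative: skip the affected replay tests
# instead of patching the stub. Only activates while the installed vcrpy
# still lacks version_string on its response stub.
import pytest
from vcr.stubs import VCRHTTPResponse


def pytest_collection_modifyitems(config, items):
    if hasattr(VCRHTTPResponse, "version_string"):
        return  # fixed vcrpy installed; run everything
    skip = pytest.mark.skip(
        reason="vcrpy stub lacks version_string required by urllib3 >= 2.3"
    )
    for item in items:
        if "LuaDNSProviderTests" in item.nodeid:
            item.add_marker(skip)

The last two failures of this group follow in their original form.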
_ LuaDNSProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _
tests/providers/integration_tests.py:541:
[... identical frame chain and inlined urllib3 _make_request source elided; the failing statement and error follow ...]
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ LuaDNSProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/luadns.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/luadns.py:140: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ LuaDNSProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ self = > ??? tests/providers/integration_tests.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/luadns.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/luadns.py:140: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ LuaDNSProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? tests/providers/integration_tests.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/luadns.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/luadns.py:140: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ LuaDNSProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/luadns.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/luadns.py:140: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ LuaDNSProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? 
tests/providers/integration_tests.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/luadns.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/luadns.py:140: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ LuaDNSProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ self = > ??? tests/providers/integration_tests.py:501: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/luadns.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/luadns.py:140: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ LuaDNSProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/luadns.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/luadns.py:140: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ LuaDNSProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/luadns.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/luadns.py:140: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/zones', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
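Every LuaDNS test above fails at the same point: newer urllib3 releases log each completed request with response.version_string, but the VCRHTTPResponse object that python-vcrpy substitutes during cassette playback does not define that attribute, so the AttributeError fires after the recorded response has already been replayed. This looks like a version mismatch between the python-vcrpy and python-urllib3 packages in the build chroot rather than a dns-lexicon bug. A minimal, hypothetical workaround sketch (not part of this build; it assumes the stub class is importable as vcr.stubs.VCRHTTPResponse, as in current vcrpy releases) would be to patch the missing attribute in from the test suite's conftest.py:

    # conftest.py (hypothetical shim): give vcrpy's playback stub the
    # version_string attribute that urllib3 reads when logging a response.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # urllib3 only interpolates this value into a debug log line, so a
        # fixed placeholder protocol string is enough for cassette playback.
        VCRHTTPResponse.version_string = "HTTP/1.1"

A python-vcrpy release that already provides version_string would make such a shim unnecessary; the placeholder value is never asserted on by the tests and only feeds the debug log line shown in the traceback.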
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
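Every failure in this run stops at the same place: urllib3's connection pool logs response.version_string at connectionpool.py:551, and the VCRHTTPResponse stub that vcrpy substitutes for the real HTTP response while replaying cassettes does not define that attribute. A minimal local shim is sketched below. It is an assumption, not part of the dns-lexicon sources: it presumes the suite picks up a top-level conftest.py and that reporting the protocol as HTTP/1.1 is acceptable for the recorded cassettes; moving to a vcrpy release that knows about version_string, or pinning urllib3 to a version that does not log it, would be the cleaner fix.

    # conftest.py  (hypothetical local shim, not shipped with the package)
    # Assumption: the installed vcrpy predates urllib3's use of
    # response.version_string, so its stub response never defines it.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # The attribute is only read for urllib3's debug log line, so a
        # fixed protocol string is enough to let the replayed responses
        # through; the cassettes here are assumed to be plain HTTP/1.1.
        VCRHTTPResponse.version_string = "HTTP/1.1"

With such a shim on the import path, pytest would load it before the provider tests run and the AttributeError above should disappear; whether the package build should instead skip these cassette-driven provider tests is a separate packaging decision.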
_ LuaDNSProviderTests.test_provider_when_calling_update_record_should_modify_record _

tests/providers/integration_tests.py:240:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/luadns.py:36: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/luadns.py:140: in _request
[requests/urllib3 frames, request locals, and the _make_request listing identical to the first LuaDNS failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ LuaDNSProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _

tests/providers/integration_tests.py:275:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/luadns.py:36: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/luadns.py:140: in _request
[requests/urllib3 frames, request locals, and the _make_request listing identical to the first LuaDNS failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ LuaDNSProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _

tests/providers/integration_tests.py:259:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/luadns.py:36: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/luadns.py:140: in _request
[requests/urllib3 frames, request locals, and the _make_request listing identical to the first LuaDNS failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
________________ MemsetProviderTests.test_provider_authenticate ________________

tests/providers/integration_tests.py:106:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/memset.py:33: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/memset.py:146: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(

method = 'GET', url = '/v1/json/dns.zone_domain_info?domain=testzone.com', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjp4'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
preload_content = False, decode_content = False, enforce_content_length = True
[_make_request source listing identical to the failures above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MemsetProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

tests/providers/integration_tests.py:126:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/memset.py:33: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/memset.py:146: in _request
[requests/urllib3 frames, request locals, and the _make_request listing identical to the MemsetProviderTests.test_provider_authenticate failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MemsetProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
tests/providers/integration_tests.py:133:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/memset.py:33: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/memset.py:146: in _request
[requests/urllib3 frames, request locals, and the _make_request listing identical to the MemsetProviderTests.test_provider_authenticate failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MemsetProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

tests/providers/integration_tests.py:156:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/memset.py:33: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/memset.py:146: in _request
[requests/urllib3 frames, request locals, and the _make_request listing identical to the MemsetProviderTests.test_provider_authenticate failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MemsetProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

tests/providers/integration_tests.py:147:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/memset.py:33: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/memset.py:146: in _request
[requests/urllib3 frames, request locals, and the _make_request listing identical to the MemsetProviderTests.test_provider_authenticate failure above]
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ MemsetProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/memset.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/memset.py:146: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/json/dns.zone_domain_info?domain=testzone.com' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjp4'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ MemsetProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _ self = > ??? 
tests/providers/integration_tests.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/memset.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/memset.py:146: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/json/dns.zone_domain_info?domain=testzone.com' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjp4'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ MemsetProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ self = > ??? tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/memset.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/memset.py:146: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/json/dns.zone_domain_info?domain=testzone.com' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjp4'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ MemsetProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/memset.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/memset.py:146: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/json/dns.zone_domain_info?domain=testzone.com' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjp4'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ MemsetProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/memset.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/memset.py:146: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/json/dns.zone_domain_info?domain=testzone.com' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjp4'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ MemsetProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? 
tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/memset.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/memset.py:146: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/json/dns.zone_domain_info?domain=testzone.com' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjp4'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ MemsetProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/memset.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/memset.py:146: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/json/dns.zone_domain_info?domain=testzone.com' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjp4'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
            if read_timeout == 0:
                raise ReadTimeoutError(
                    self, url, f"Read timed out. (read timeout={read_timeout})"
                )
            conn.timeout = read_timeout

        # Receive the response from the server
        try:
            response = conn.getresponse()
        except (BaseSSLError, OSError) as e:
            self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
            raise

        # Set properties that are used by the pooling layer.
        response.retries = retries
        response._connection = response_conn  # type: ignore[attr-defined]
        response._pool = self  # type: ignore[attr-defined]

        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MemsetProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _
tests/providers/integration_tests.py:541:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/memset.py:33: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/memset.py:146: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MemsetProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _
tests/providers/integration_tests.py:521:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/memset.py:33: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/memset.py:146: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MemsetProviderTests.test_provider_when_calling_list_records_after_setting_ttl _
tests/providers/integration_tests.py:211:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/memset.py:33: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/memset.py:146: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MemsetProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _
tests/providers/integration_tests.py:507:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/memset.py:33: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/memset.py:146: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
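All of these failures are the same incompatibility: the urllib3 in this chroot (2.3 or later) logs response.version_string for every request, but the VCRHTTPResponse stub that python-vcrpy substitutes while replaying cassettes predates that attribute, so each Memset provider test aborts inside urllib3's connectionpool before the recorded response is even parsed. Below is a minimal sketch of a local shim, assuming the missing attribute is the only problem; the conftest.py file and the numeric-version mapping are assumptions for illustration, not part of the dns-lexicon sources. If a newer python-vcrpy already ships version_string support, updating that package is the cleaner fix.

# conftest.py (hypothetical shim, applied only for this test run)
try:
    from vcr.stubs import VCRHTTPResponse
except ImportError:
    VCRHTTPResponse = None

if VCRHTTPResponse is not None and not hasattr(VCRHTTPResponse, "version_string"):
    @property
    def version_string(self):
        # urllib3 >= 2.3 reads response.version_string for its debug log line;
        # translate the stub's numeric HTTP version (10/11), if any, to a string.
        return {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(getattr(self, "version", 11), "HTTP/1.1")

    VCRHTTPResponse.version_string = version_string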
_ MemsetProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _
tests/providers/integration_tests.py:199:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/memset.py:33: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/memset.py:146: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MemsetProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _
tests/providers/integration_tests.py:185:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/memset.py:33: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/memset.py:146: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MemsetProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _
tests/providers/integration_tests.py:501:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/memset.py:33: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/memset.py:146: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
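If patching the stub is undesirable for a packaged test suite, the cassette-backed suites can instead be skipped at collection time until python-vcrpy and urllib3 agree again. A hypothetical conftest.py hook follows; the selection only covers the MemsetProviderTests visible in this log and is not something the PKGBUILD currently does.

# conftest.py (hypothetical) -- skip the replay-based suites hit by the
# missing VCRHTTPResponse.version_string attribute
import pytest

def pytest_collection_modifyitems(config, items):
    reason = "vcrpy VCRHTTPResponse lacks version_string (urllib3 >= 2.3)"
    for item in items:
        if "MemsetProviderTests" in item.nodeid:
            item.add_marker(pytest.mark.skip(reason=reason))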
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ MemsetProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/memset.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/memset.py:146: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/json/dns.zone_domain_info?domain=testzone.com' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjp4'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ MemsetProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/memset.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/memset.py:146: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/json/dns.zone_domain_info?domain=testzone.com' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjp4'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
>               response.version_string,
                response.status,
                response.length_remaining,
            )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
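Every failure in this run reduces to the same incompatibility: urllib3's _make_request() now includes response.version_string in its debug log line (connectionpool.py:551), but the replayed response object that vcrpy substitutes during these cassette-driven provider tests, VCRHTTPResponse, does not define that attribute, so the lookup raises before any provider-specific assertion is reached. A minimal sketch of the failing lookup, using a hypothetical stand-in class rather than vcrpy's real one:

    # repro_sketch.py -- illustrates the AttributeError reported above.
    # FakeRecordedResponse is a hypothetical stand-in for vcrpy's VCRHTTPResponse:
    # like it, the class exposes status and length_remaining but no version_string.

    class FakeRecordedResponse:
        status = 200
        length_remaining = 0

    def debug_line(response, method: str, url: str) -> str:
        # Mirrors the attribute accesses urllib3 performs while formatting its
        # debug log line; the first missing attribute aborts the whole request.
        return (f'"{method} {url}" {response.version_string} '
                f"{response.status} {response.length_remaining}")

    try:
        debug_line(FakeRecordedResponse(), "GET", "/v1/json/dns.zone_domain_info?domain=testzone.com")
    except AttributeError as exc:
        print(f"AttributeError: {exc}")  # ... has no attribute 'version_string'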
_ MemsetProviderTests.test_provider_when_calling_update_record_should_modify_record _

tests/providers/integration_tests.py:240:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/memset.py:33: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/memset.py:146: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
method = 'GET', url = '/v1/json/dns.zone_domain_info?domain=testzone.com', body = '{}'
>               response.version_string,
                response.status,
                response.length_remaining,
            )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MemsetProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _

tests/providers/integration_tests.py:275:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/memset.py:33: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/memset.py:146: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
method = 'GET', url = '/v1/json/dns.zone_domain_info?domain=testzone.com', body = '{}'
>               response.version_string,
                response.status,
                response.length_remaining,
            )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MemsetProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _

tests/providers/integration_tests.py:259:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/memset.py:33: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/memset.py:146: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
method = 'GET', url = '/v1/json/dns.zone_domain_info?domain=testzone.com', body = '{}'
>               response.version_string,
                response.status,
                response.length_remaining,
            )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
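The attribute itself is one that recent urllib3 releases add to their own response classes, derived from the numeric HTTP version that http.client already tracks; the replayed VCR response simply never grew it. As a rough approximation of that behaviour (not urllib3's actual implementation), a response object can expose the string form like this:

    # version_string_sketch.py -- approximates how a urllib3-style response can
    # derive version_string from the integer `version` field (10, 11, 20) that
    # http.client provides; urllib3's real implementation may differ in detail.

    class ResponseWithVersionString:
        def __init__(self, version: int = 11, status: int = 200) -> None:
            self.version = version      # 10 -> HTTP/1.0, 11 -> HTTP/1.1, 20 -> HTTP/2
            self.status = status

        @property
        def version_string(self) -> str:
            return {10: "HTTP/1.0", 11: "HTTP/1.1", 20: "HTTP/2"}.get(self.version, "HTTP/?")

    print(ResponseWithVersionString().version_string)    # HTTP/1.1
    print(ResponseWithVersionString(20).version_string)  # HTTP/2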
________________ MisakaProviderTests.test_provider_authenticate ________________

tests/providers/integration_tests.py:106:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/misaka.py:64: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/misaka.py:228: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
method = 'GET', url = '/dns/zones/misaka-dns-test.stream/settings', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...eep-alive', 'Content-Type': 'application/json', 'Authorization': 'Token placeholder_auth_token', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
preload_content = False, decode_content = False, enforce_content_length = True
>               response.version_string,
                response.status,
                response.length_remaining,
            )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MisakaProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

tests/providers/integration_tests.py:126:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/misaka.py:64: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/misaka.py:228: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
method = 'GET', url = '/dns/zones/misaka-dns-test.stream/settings', body = '{}'
>               response.version_string,
                response.status,
                response.length_remaining,
            )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MisakaProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

tests/providers/integration_tests.py:133:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/misaka.py:64: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/misaka.py:228: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
method = 'GET', url = '/dns/zones/misaka-dns-test.stream/settings', body = '{}'
>               response.version_string,
                response.status,
                response.length_remaining,
            )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
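Because every provider suite fails inside the replay shim rather than in lexicon itself, one way such a run could be unblocked locally is to give the replay class the attribute urllib3's logging expects until a vcrpy release that understands it is available. A sketch of a possible conftest-level workaround (not the fix actually applied to this package), assuming the class is importable as vcr.stubs.VCRHTTPResponse:

    # conftest_shim_sketch.py -- a possible local workaround, not what this build
    # does: give vcrpy's replay response the attribute urllib3's debug logging
    # reads. Assumes the class is importable as vcr.stubs.VCRHTTPResponse.

    import vcr.stubs

    def add_version_string_shim() -> None:
        # Cassette replay does not care about the HTTP version, and urllib3 only
        # uses the value for a log line, so a constant stand-in is sufficient.
        if not hasattr(vcr.stubs.VCRHTTPResponse, "version_string"):
            vcr.stubs.VCRHTTPResponse.version_string = "HTTP/1.1"

    add_version_string_shim()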
_ MisakaProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

tests/providers/integration_tests.py:156:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/misaka.py:64: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/misaka.py:228: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
method = 'GET', url = '/dns/zones/misaka-dns-test.stream/settings', body = '{}'
>               response.version_string,
                response.status,
                response.length_remaining,
            )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MisakaProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

tests/providers/integration_tests.py:147:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/misaka.py:64: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/misaka.py:228: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
method = 'GET', url = '/dns/zones/misaka-dns-test.stream/settings', body = '{}'
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ MisakaProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/misaka.py:64: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/misaka.py:228: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/zones/misaka-dns-test.stream/settings', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...eep-alive', 'Content-Type': 'application/json', 'Authorization': 'Token placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ MisakaProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _ self = > ??? 
tests/providers/integration_tests.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/misaka.py:64: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/misaka.py:228: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/zones/misaka-dns-test.stream/settings', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...eep-alive', 'Content-Type': 'application/json', 'Authorization': 'Token placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ MisakaProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ self = > ??? tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/misaka.py:64: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/misaka.py:228: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/zones/misaka-dns-test.stream/settings', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...eep-alive', 'Content-Type': 'application/json', 'Authorization': 'Token placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ MisakaProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/misaka.py:64: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/misaka.py:228: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/zones/misaka-dns-test.stream/settings', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...eep-alive', 'Content-Type': 'application/json', 'Authorization': 'Token placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
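The incompatibility is not specific to the Misaka provider or to these particular tests: any cassette replayed through vcrpy under this urllib3 passes through the same _make_request logging call. A minimal sketch of the failure mode, assuming some existing cassette (the path below is hypothetical):

    import requests
    import vcr

    # Replaying any recorded exchange routes the response through urllib3's
    # _make_request, which reads response.version_string from the VCR stub.
    with vcr.use_cassette("tests/fixtures/example.yaml"):
        requests.get("https://example.com/")  # AttributeError: ... 'version_string'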
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ MisakaProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/misaka.py:64: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/misaka.py:228: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/zones/misaka-dns-test.stream/settings', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...eep-alive', 'Content-Type': 'application/json', 'Authorization': 'Token placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ MisakaProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? 
tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/misaka.py:64: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/misaka.py:228: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/zones/misaka-dns-test.stream/settings', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...eep-alive', 'Content-Type': 'application/json', 'Authorization': 'Token placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ MisakaProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/misaka.py:64: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/misaka.py:228: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/zones/misaka-dns-test.stream/settings', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...eep-alive', 'Content-Type': 'application/json', 'Authorization': 'Token placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
                    if read_timeout == 0:
                        raise ReadTimeoutError(
                            self, url, f"Read timed out. (read timeout={read_timeout})"
                        )
                    conn.timeout = read_timeout

            # Receive the response from the server
            try:
                response = conn.getresponse()
            except (BaseSSLError, OSError) as e:
                self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
                raise

            # Set properties that are used by the pooling layer.
            response.retries = retries
            response._connection = response_conn  # type: ignore[attr-defined]
            response._pool = self  # type: ignore[attr-defined]

            log.debug(
                '%s://%s:%s "%s %s %s" %s %s',
                self.scheme,
                self.host,
                self.port,
                method,
                url,
>               response.version_string,
                response.status,
                response.length_remaining,
            )
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MisakaProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _

self = >

???
tests/providers/integration_tests.py:541:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/misaka.py:64: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/misaka.py:228: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
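Every remaining failure in this run is the same crash: urllib3's `_make_request` now logs `response.version_string`, and the replayed `VCRHTTPResponse` object served up by the test cassettes does not define that attribute. The snippet below is a minimal sketch of a test-side workaround, assuming the stub comes from vcrpy (`vcr.stubs.VCRHTTPResponse`, as the class name in the traceback suggests), that the installed urllib3 is a 2.3-era release, and that newer vcrpy releases already ship an equivalent fix; it is illustrative only and not part of the dns-lexicon sources.

```python
# conftest.py (hypothetical workaround sketch, not part of dns-lexicon)
# Assumption: recent urllib3 logs response.version_string, which vcrpy's
# recorded-response stub predates; newer vcrpy releases reportedly add it.
try:
    from vcr.stubs import VCRHTTPResponse  # class named in the traceback above
except ImportError:  # vcrpy not installed; nothing to patch
    VCRHTTPResponse = None

if VCRHTTPResponse is not None and not hasattr(VCRHTTPResponse, "version_string"):

    @property
    def _version_string(self):
        # Mirror urllib3's mapping of the integer HTTP version to a string,
        # falling back to HTTP/1.1 for replayed responses.
        return {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(getattr(self, "version", 11), "HTTP/1.1")

    # Attach the property to the stub class so urllib3's debug-log call succeeds.
    VCRHTTPResponse.version_string = _version_string
```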
_ MisakaProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _

self = >

???
tests/providers/integration_tests.py:521:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/misaka.py:64: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/misaka.py:228: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MisakaProviderTests.test_provider_when_calling_list_records_after_setting_ttl _

self = >

???
tests/providers/integration_tests.py:211:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/misaka.py:64: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/misaka.py:228: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MisakaProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _

self = >

???
tests/providers/integration_tests.py:507:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/misaka.py:64: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/misaka.py:228: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MisakaProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _

self = >

???
tests/providers/integration_tests.py:199:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/misaka.py:64: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/misaka.py:228: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MisakaProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _

self = >

???
tests/providers/integration_tests.py:185:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/misaka.py:64: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/misaka.py:228: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MisakaProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _

self = >

???
tests/providers/integration_tests.py:501:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/misaka.py:64: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/misaka.py:228: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MisakaProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _

self = >

???
tests/providers/integration_tests.py:173:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/misaka.py:64: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/misaka.py:228: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MisakaProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _

self = >

???
tests/providers/integration_tests.py:166:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/misaka.py:64: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/misaka.py:228: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
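The same AttributeError repeats for every Misaka provider test that replays a cassette, so the failure count tracks the number of recorded tests rather than distinct problems. A quick interactive check like the sketch below (only standard urllib3 names, nothing package-specific is assumed) can confirm whether the installed urllib3 is new enough to expose `HTTPResponse.version_string`, and therefore whether the recording stub, not urllib3, is the piece that needs updating.

```python
# Quick sanity check: does the installed urllib3 expose version_string?
import urllib3
from urllib3.response import HTTPResponse

print("urllib3 version:", urllib3.__version__)
print("has version_string:", hasattr(HTTPResponse, "version_string"))
```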
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ MisakaProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/misaka.py:64: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/misaka.py:228: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/zones/misaka-dns-test.stream/settings', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...eep-alive', 'Content-Type': 'application/json', 'Authorization': 'Token placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ MisakaProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? 
tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/misaka.py:64: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/misaka.py:228: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/zones/misaka-dns-test.stream/settings', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...eep-alive', 'Content-Type': 'application/json', 'Authorization': 'Token placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ MisakaProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/misaka.py:64: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/misaka.py:228: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/zones/misaka-dns-test.stream/settings', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...eep-alive', 'Content-Type': 'application/json', 'Authorization': 'Token placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ MisakaProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/misaka.py:64: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/misaka.py:228: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/zones/misaka-dns-test.stream/settings', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...eep-alive', 'Content-Type': 'application/json', 'Authorization': 'Token placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
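Every failure in this run is the same crash: urllib3's connection pool tries to log response.version_string after getresponse(), and the VCRHTTPResponse object that vcrpy substitutes while replaying cassettes has no such attribute in the release installed in this chroot. A quick way to confirm the suspected version mismatch (run by hand in the build root, not part of the packaging itself) would be:

# Hedged diagnostic sketch, not part of the build scripts: print the versions
# actually installed in the chroot. version_string appears to have entered
# urllib3's pool logging in the 2.3 series, while the vcrpy release here
# apparently predates the matching VCRHTTPResponse update.
from importlib.metadata import version

print("urllib3:", version("urllib3"))
print("vcrpy:  ", version("vcrpy"))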
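If the goal were only to get check() past this incompatibility until vcrpy and urllib3 are back in sync, one option would be a small compatibility shim in a test conftest.py. This is a sketch under the assumption that patching the stub class is acceptable for packaging; the hard-coded "HTTP/1.1" is an assumption about the recorded cassettes, and this is not something the upstream test suite or this PKGBUILD is known to do:

# Hypothetical conftest.py shim: give vcrpy's playback stub the attribute that
# urllib3's logging now expects. VCRHTTPResponse is the class named in the
# tracebacks above; "HTTP/1.1" is assumed, not read back from the cassettes.
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):
    VCRHTTPResponse.version_string = "HTTP/1.1"

The cleaner fix is simply to rebuild once a vcrpy release that knows about version_string is available, or to run the suite against an urllib3 that predates the new log field.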
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _____________ MythicBeastsProviderTests.test_provider_authenticate _____________ self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/mythicbeasts.py:56: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/login', body = 'grant_type=client_credentials' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv.../x-www-form-urlencoded', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Bhc3N3b3Jk'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) 
:param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ MythicBeastsProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/mythicbeasts.py:56: in authenticate ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/login', body = 'grant_type=client_credentials' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv.../x-www-form-urlencoded', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Bhc3N3b3Jk'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ MythicBeastsProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _ self = > ??? 
tests/providers/integration_tests.py:133: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/mythicbeasts.py:56: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/login', body = 'grant_type=client_credentials' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv.../x-www-form-urlencoded', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Bhc3N3b3Jk'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. 
:param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ MythicBeastsProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = > ??? tests/providers/integration_tests.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/mythicbeasts.py:56: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/login', body = 'grant_type=client_credentials' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv.../x-www-form-urlencoded', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Bhc3N3b3Jk'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. 
If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
_ MythicBeastsProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

tests/providers/integration_tests.py:147:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/mythicbeasts.py:56: in authenticate
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MythicBeastsProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _

tests/providers/integration_tests.py:140:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/mythicbeasts.py:56: in authenticate
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MythicBeastsProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _

tests/providers/integration_tests.py:489:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/mythicbeasts.py:56: in authenticate
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MythicBeastsProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _

tests/providers/integration_tests.py:475:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/mythicbeasts.py:56: in authenticate
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MythicBeastsProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _

tests/providers/integration_tests.py:303:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/mythicbeasts.py:56: in authenticate
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MythicBeastsProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _

tests/providers/integration_tests.py:327:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/mythicbeasts.py:56: in authenticate
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MythicBeastsProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _

tests/providers/integration_tests.py:313:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/mythicbeasts.py:56: in authenticate
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MythicBeastsProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _

tests/providers/integration_tests.py:294:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/mythicbeasts.py:56: in authenticate
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
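The traceback never varies across these tests, so before patching anything it is worth confirming which side of the urllib3/vcrpy pairing in the chroot is missing the attribute. The sketch below is only a diagnostic aid and assumes the PyPI distribution names urllib3 and vcrpy.

    # Diagnostic sketch: report the installed versions and whether each side of the
    # pairing defines version_string.
    from importlib.metadata import version

    import urllib3.response
    from vcr.stubs import VCRHTTPResponse

    print("urllib3", version("urllib3"), "vcrpy", version("vcrpy"))
    print("urllib3 BaseHTTPResponse has version_string:",
          hasattr(urllib3.response.BaseHTTPResponse, "version_string"))
    print("VCRHTTPResponse has version_string:",
          hasattr(VCRHTTPResponse, "version_string"))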
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ MythicBeastsProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/mythicbeasts.py:56: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/login', body = 'grant_type=client_credentials' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv.../x-www-form-urlencoded', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Bhc3N3b3Jk'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
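Every Mythic Beasts failure in this run dies on the same line: the installed urllib3 (2.3 or later, judging by the version_string access) logs response.version_string as soon as a response is received, and vcrpy's cassette playback stub VCRHTTPResponse does not define that attribute, so the log.debug() call raises before the provider test ever sees a response; newer vcrpy releases reportedly add the attribute. The snippet below is a paraphrased sketch of the property urllib3 expects on every response object, not urllib3's actual source; the class name and the mapping are illustrative and assume http.client's integer encoding of the protocol version.

# Paraphrased sketch only, not urllib3's real implementation. It shows the
# attribute that the log.debug() call in the tracebacks above dereferences.
class ResponseSketch:
    version: int = 11  # http.client-style integer protocol version

    @property
    def version_string(self) -> str:
        # e.g. 10 -> "HTTP/1.0", 11 -> "HTTP/1.1", 20 -> "HTTP/2"
        return {9: "HTTP/0.9", 10: "HTTP/1.0", 11: "HTTP/1.1", 20: "HTTP/2"}.get(
            self.version, "HTTP/?"
        )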
_ MythicBeastsProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _
self = > ??? tests/providers/integration_tests.py:541: [call chain and _make_request listing identical to the failures above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MythicBeastsProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _
self = > ??? tests/providers/integration_tests.py:521: [call chain and _make_request listing identical to the failures above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MythicBeastsProviderTests.test_provider_when_calling_list_records_after_setting_ttl _
self = > ??? tests/providers/integration_tests.py:211: [call chain and _make_request listing identical to the failures above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MythicBeastsProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _
self = > ??? tests/providers/integration_tests.py:507: [call chain and _make_request listing identical to the failures above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MythicBeastsProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _
self = > ??? tests/providers/integration_tests.py:199: [call chain and _make_request listing identical to the failures above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MythicBeastsProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _
self = > ??? tests/providers/integration_tests.py:185: [call chain and _make_request listing identical to the failures above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MythicBeastsProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _
self = > ??? tests/providers/integration_tests.py:501: [call chain and _make_request listing identical to the failures above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MythicBeastsProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _
self = > ??? tests/providers/integration_tests.py:173: [call chain and _make_request listing identical to the failures above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
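Until the build environment picks up a vcrpy release that knows about version_string, a build-local workaround is conceivable. The sketch below is hypothetical and not taken from this log or from dns-lexicon: it patches the missing attribute onto vcrpy's playback stub from a test-suite conftest.py, assuming vcr.stubs.VCRHTTPResponse is the class named in the AttributeError and that reporting every replayed response as HTTP/1.1 is acceptable.

# Hypothetical conftest.py shim, an assumption-laden workaround rather than
# the upstream fix: give vcrpy's playback response the attribute that
# urllib3 >= 2.3 reads when logging a completed request.
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):
    # A plain class attribute is enough for the log.debug() access shown above.
    VCRHTTPResponse.version_string = "HTTP/1.1"

An alternative with the same effect would be to deselect the affected provider tests for this build and revisit once vcrpy and urllib3 agree again.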
_ MythicBeastsProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ???
_ MythicBeastsProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _

self = > ???

tests/providers/integration_tests.py:166:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/mythicbeasts.py:56: in authenticate
    ???
[requests/urllib3 frames and the _make_request source repeat as in the first failure above; recorded request: POST /login]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MythicBeastsProviderTests.test_provider_when_calling_update_record_should_modify_record _

self = > ???

tests/providers/integration_tests.py:240:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/mythicbeasts.py:56: in authenticate
    ???
[requests/urllib3 frames and the _make_request source repeat as in the first failure above; recorded request: POST /login]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MythicBeastsProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _

self = > ???

tests/providers/integration_tests.py:251:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/mythicbeasts.py:56: in authenticate
    ???
[requests/urllib3 frames and the _make_request source repeat as in the first failure above; recorded request: POST /login]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MythicBeastsProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _

self = > ???

tests/providers/integration_tests.py:275:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/mythicbeasts.py:56: in authenticate
    ???
[requests/urllib3 frames and the _make_request source repeat as in the first failure above; recorded request: POST /login]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ MythicBeastsProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _

self = > ???

tests/providers/integration_tests.py:259:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/mythicbeasts.py:56: in authenticate
    ???
[requests/urllib3 frames and the _make_request source repeat as in the first failure above; recorded request: POST /login]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_______________ NamecomProviderTests.test_provider_authenticate ________________

self = > ???

tests/providers/integration_tests.py:106:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/namecom.py:67: in authenticate
    ???
src/lexicon/_private/providers/namecom.py:36: in __iter__
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/namecom.py:203: in _request
    ???
method = 'GET', url = '/v4/domains?page=1', body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
[requests/urllib3 frames and the _make_request source repeat as in the first failure above]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
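As a side note, the Basic credentials captured in these recorded requests are cassette placeholders rather than real secrets; decoding the Authorization values copied verbatim from the dumps above shows this:

# Decode the recorded Authorization headers (values copied from the log above).
import base64

mythicbeasts = "cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Bhc3N3b3Jk"
namecom = "cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu"

print(base64.b64decode(mythicbeasts).decode())  # placeholder_auth_username:placeholder_auth_password
print(base64.b64decode(namecom).decode())       # placeholder_auth_username:placeholder_auth_token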
___________ NamecomProviderTests.test_provider_authentication_method ___________

self = > ???

tests/providers/test_namecom.py:55:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/namecom.py:67: in authenticate
    ???
src/lexicon/_private/providers/namecom.py:36: in __iter__
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/namecom.py:203: in _request
    ???
[requests/urllib3 frames and the _make_request source repeat as in the first failure above; recorded request: GET /v4/domains?page=1]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ NamecomProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

self = > ???
tests/providers/integration_tests.py:126:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/namecom.py:67: in authenticate
    ???
src/lexicon/_private/providers/namecom.py:36: in __iter__
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/namecom.py:203: in _request
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  conn =  method = 'GET', url = '/v4/domains?page=1', body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =  preload_content = False, decode_content = False, enforce_content_length = True

    def _make_request(
        self,
        conn: BaseHTTPConnection,
        method: str,
        url: str,
        body: _TYPE_BODY | None = None,
        headers: typing.Mapping[str, str] | None = None,
        retries: Retry | None = None,
        timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
        chunked: bool = False,
        response_conn: BaseHTTPConnection | None = None,
        preload_content: bool = True,
        decode_content: bool = True,
        enforce_content_length: bool = True,
    ) -> BaseHTTPResponse:
        [docstring and body of urllib3's HTTPConnectionPool._make_request echoed by pytest; only the failing tail is shown below]
        ...
        # Set properties that are used by the pooling layer.
        response.retries = retries
        response._connection = response_conn  # type: ignore[attr-defined]
        response._pool = self  # type: ignore[attr-defined]

        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
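Every failure in this run is the same crash at the same spot: the replay stub that vcrpy substitutes for the real HTTP response (VCRHTTPResponse) does not define version_string, an attribute that urllib3 2.3 started reading in HTTPConnectionPool._make_request() when it logs the completed request. The tests therefore die inside authenticate()'s initial GET /v4/domains?page=1, before the record operation under test ever runs. A minimal, hypothetical shim is sketched below, assuming the cause really is vcrpy lagging behind urllib3 >= 2.3; it is not the packaged fix, only a way to confirm the diagnosis locally (the clean fix is a vcrpy release that knows about version_string, or holding urllib3 below 2.3 in the build root).

    # conftest.py (hypothetical local shim, not part of dns-lexicon): backfill the
    # attribute that urllib3 >= 2.3 logs, so VCR-replayed responses survive the
    # log.debug() call in HTTPConnectionPool._make_request().
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # urllib3 only interpolates this into a debug log line, so a fixed,
        # plausible value is enough for the tests to proceed.
        VCRHTTPResponse.version_string = "HTTP/1.1"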
_ NamecomProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

self = > ???
tests/providers/integration_tests.py:133:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
[remaining frames, captured locals, and the echoed urllib3 _make_request source are identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ NamecomProviderTests.test_provider_when_calling_create_record_for_MX_with_no_priority _

self = > ???
tests/providers/test_namecom.py:80:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
[remaining frames, captured locals, and the echoed urllib3 _make_request source are identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ NamecomProviderTests.test_provider_when_calling_create_record_for_MX_with_priority _

self = > ???
tests/providers/test_namecom.py:69:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
src/lexicon/_private/providers/namecom.py:67: in authenticate
    ???
[remaining frames, captured locals, and the echoed urllib3 _make_request source are identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ NamecomProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

self = > ???
tests/providers/integration_tests.py:156:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
[remaining frames, captured locals, and the echoed urllib3 _make_request source are identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ NamecomProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

self = > ???
tests/providers/integration_tests.py:147:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
[remaining frames, captured locals, and the echoed urllib3 _make_request source are identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ NamecomProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _

self = > ???
tests/providers/integration_tests.py:140:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
[remaining frames, captured locals, and the echoed urllib3 _make_request source are identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ NamecomProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _

self = > ???
tests/providers/integration_tests.py:489:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
[remaining frames, captured locals, and the echoed urllib3 _make_request source are identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ NamecomProviderTests.test_provider_when_calling_create_record_should_fail_on_http_error _

self = > ???
tests/providers/test_namecom.py:88:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
[remaining frames, captured locals, and the echoed urllib3 _make_request source are identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NamecomProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ self = > ??? tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/namecom.py:67: in authenticate ??? src/lexicon/_private/providers/namecom.py:36: in __iter__ ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namecom.py:203: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains?page=1', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. 
Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NamecomProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/namecom.py:67: in authenticate ??? src/lexicon/_private/providers/namecom.py:36: in __iter__ ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namecom.py:203: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains?page=1', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. 
If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NamecomProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/namecom.py:67: in authenticate ??? src/lexicon/_private/providers/namecom.py:36: in __iter__ ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namecom.py:203: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains?page=1', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NamecomProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/namecom.py:67: in authenticate ??? src/lexicon/_private/providers/namecom.py:36: in __iter__ ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namecom.py:203: in _request ??? 
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains?page=1', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. 
""" self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NamecomProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/namecom.py:67: in authenticate ??? 
src/lexicon/_private/providers/namecom.py:36: in __iter__ ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namecom.py:203: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains?page=1', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NamecomProviderTests.test_provider_when_calling_delete_record_should_pass_if_no_record_to_delete _ self = warning = > ??? 
tests/providers/test_namecom.py:154: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/namecom.py:67: in authenticate ??? src/lexicon/_private/providers/namecom.py:36: in __iter__ ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namecom.py:203: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains?page=1', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. 
:param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NamecomProviderTests.test_provider_when_calling_delete_record_with_no_identifier_or_rtype_and_name_should_fail _ self = > ??? tests/providers/test_namecom.py:145: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/namecom.py:67: in authenticate ??? src/lexicon/_private/providers/namecom.py:36: in __iter__ ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namecom.py:203: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains?page=1', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. 
If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NamecomProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/namecom.py:67: in authenticate ??? src/lexicon/_private/providers/namecom.py:36: in __iter__ ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namecom.py:203: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains?page=1', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
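The cleanest fix would be a vcrpy release that provides the attribute itself, or an urllib3 in the build root pinned below the release that introduced the logging change. If neither is available when the package needs to move, a local shim in the test configuration is one possible stopgap; the conftest.py snippet below is a sketch under that assumption, not an upstream fix, and the property body is a guess at a reasonable value rather than anything recorded in the cassettes:

# conftest.py -- hypothetical stopgap: give vcrpy's stub the attribute urllib3 wants to log.
from vcr.stubs import VCRHTTPResponse


def _version_string(self):
    # http.client-style numeric versions map to the strings urllib3 would log;
    # fall back to a generic marker when the cassette did not record a version.
    return {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(getattr(self, "version", None), "HTTP/?")


if not hasattr(VCRHTTPResponse, "version_string"):
    VCRHTTPResponse.version_string = property(_version_string)

Since the attribute is only read for a debug log line, the exact value should not affect the assertions in these tests, and the shim becomes a no-op once a fixed vcrpy lands in the repositories.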
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NamecomProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/namecom.py:67: in authenticate ??? src/lexicon/_private/providers/namecom.py:36: in __iter__ ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namecom.py:203: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains?page=1', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NamecomProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ self = > ??? tests/providers/integration_tests.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/namecom.py:67: in authenticate ??? src/lexicon/_private/providers/namecom.py:36: in __iter__ ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namecom.py:203: in _request ??? 
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains?page=1', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. 
""" self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NamecomProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? tests/providers/integration_tests.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/namecom.py:67: in authenticate ??? src/lexicon/_private/providers/namecom.py:36: in __iter__ ??? 
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namecom.py:203: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains?page=1', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NamecomProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? 
tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/namecom.py:67: in authenticate ??? src/lexicon/_private/providers/namecom.py:36: in __iter__ ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namecom.py:203: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains?page=1', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. 
:param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NamecomProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/namecom.py:67: in authenticate ??? src/lexicon/_private/providers/namecom.py:36: in __iter__ ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namecom.py:203: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains?page=1', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. 
If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NamecomProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ self = > ??? tests/providers/integration_tests.py:501: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/namecom.py:67: in authenticate ??? src/lexicon/_private/providers/namecom.py:36: in __iter__ ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namecom.py:203: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains?page=1', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. 
Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NamecomProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/namecom.py:67: in authenticate ??? src/lexicon/_private/providers/namecom.py:36: in __iter__ ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namecom.py:203: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains?page=1', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. 
If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NamecomProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/namecom.py:67: in authenticate ??? src/lexicon/_private/providers/namecom.py:36: in __iter__ ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namecom.py:203: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains?page=1', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NamecomProviderTests.test_provider_when_calling_update_record_by_identifier_with_no_other_args_should_pass _ self = > ??? tests/providers/test_namecom.py:134: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/namecom.py:67: in authenticate ??? src/lexicon/_private/providers/namecom.py:36: in __iter__ ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namecom.py:203: in _request ??? 
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains?page=1', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. 
""" self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NamecomProviderTests.test_provider_when_calling_update_record_filter_by_content_should_pass _ self = > ??? tests/providers/test_namecom.py:125: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/namecom.py:67: in authenticate ??? src/lexicon/_private/providers/namecom.py:36: in __iter__ ??? 
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namecom.py:203: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains?page=1', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NamecomProviderTests.test_provider_when_calling_update_record_should_fail_if_multiple_records_to_update _ self = > ??? 
tests/providers/test_namecom.py:117: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/namecom.py:67: in authenticate ??? src/lexicon/_private/providers/namecom.py:36: in __iter__ ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namecom.py:203: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains?page=1', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. 
:param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NamecomProviderTests.test_provider_when_calling_update_record_should_fail_if_no_record_to_update _ self = > ??? tests/providers/test_namecom.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/namecom.py:67: in authenticate ??? src/lexicon/_private/providers/namecom.py:36: in __iter__ ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namecom.py:203: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains?page=1', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. 
If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NamecomProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/namecom.py:67: in authenticate ??? src/lexicon/_private/providers/namecom.py:36: in __iter__ ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namecom.py:203: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains?page=1', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. 
Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NamecomProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/namecom.py:67: in authenticate ??? src/lexicon/_private/providers/namecom.py:36: in __iter__ ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namecom.py:203: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains?page=1', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. 
If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NamecomProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/namecom.py:67: in authenticate ??? src/lexicon/_private/providers/namecom.py:36: in __iter__ ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namecom.py:203: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains?page=1', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NamecomProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/namecom.py:67: in authenticate ??? src/lexicon/_private/providers/namecom.py:36: in __iter__ ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namecom.py:203: in _request ??? 
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v4/domains?page=1', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. 
""" self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NamecomProviderTests.test_provider_when_calling_update_record_with_no_identifier_or_rtype_and_name_should_fail _ self = > ??? tests/providers/test_namecom.py:101: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/namecom.py:67: in authenticate ??? 
src/lexicon/_private/providers/namecom.py:36: in __iter__
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/namecom.py:203: in _request
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  conn =  method = 'GET', url = '/v4/domains?page=1', body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =  preload_content = False, decode_content = False, enforce_content_length = True

>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_______________ NameSiloProviderTests.test_provider_authenticate _______________

self =  > ???
src/lexicon/_private/providers/namesilo.py:36:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/namesilo.py:142: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  conn =  method = 'GET'
url = '/api/getDomainInfo?domain=capsulecdfake.com&version=1&type=xml&key=placeholder_auth_token'
body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =  preload_content = False, decode_content = False, enforce_content_length = True

>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

self =  > ???

tests/providers/integration_tests.py:106:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =  > ???
E   lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'

src/lexicon/_private/providers/namesilo.py:38: AuthenticationError
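Every failure in this run reduces to the same incompatibility: the urllib3 in the chroot logs response.version_string inside connectionpool._make_request (an attribute introduced in urllib3 2.3.0), while the VCRHTTPResponse stub that replays the recorded cassettes for these provider tests does not define it. Below is a minimal sketch of a conftest-level shim, assuming the stub in play is vcr.stubs.VCRHTTPResponse from python-vcrpy; upgrading python-vcrpy to a release that provides the attribute would be the cleaner fix.

# conftest.py -- hypothetical local shim, not part of dns-lexicon itself.
# Assumes the replayed responses are vcr.stubs.VCRHTTPResponse objects
# (python-vcrpy); newer vcrpy releases ship an equivalent attribute, so an
# upgrade would make this unnecessary.
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):

    def _version_string(self):
        # http.client-style version codes: 10 -> HTTP/1.0, 11 -> HTTP/1.1.
        return {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(getattr(self, "version", 11), "HTTP/1.1")

    # Expose the read-only attribute that urllib3's request logging expects.
    VCRHTTPResponse.version_string = property(_version_string)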
_ NameSiloProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

self =  > ???

src/lexicon/_private/providers/namesilo.py:36:
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/namesilo.py:142: in _request
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

tests/providers/integration_tests.py:126:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
E   lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/namesilo.py:38: AuthenticationError
_ NameSiloProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

self =  > ???

src/lexicon/_private/providers/namesilo.py:36:
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/namesilo.py:142: in _request
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

tests/providers/integration_tests.py:133:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
E   lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/namesilo.py:38: AuthenticationError
_ NameSiloProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

self =  > ???

src/lexicon/_private/providers/namesilo.py:36:
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/namesilo.py:142: in _request
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

tests/providers/integration_tests.py:156:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
E   lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/namesilo.py:38: AuthenticationError
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = > ??? E lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string' src/lexicon/_private/providers/namesilo.py:38: AuthenticationError _ NameSiloProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? src/lexicon/_private/providers/namesilo.py:36: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namesilo.py:142: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/api/getDomainInfo?domain=capsulecdfake.com&version=1&type=xml&key=placeholder_auth_token' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = > ??? E lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string' src/lexicon/_private/providers/namesilo.py:38: AuthenticationError _ NameSiloProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _ self = > ??? src/lexicon/_private/providers/namesilo.py:36: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namesilo.py:142: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/api/getDomainInfo?domain=capsulecdfake.com&version=1&type=xml&key=placeholder_auth_token' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 
:param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
        response.retries = retries
        response._connection = response_conn  # type: ignore[attr-defined]
        response._pool = self  # type: ignore[attr-defined]
        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

self = > ???
tests/providers/integration_tests.py:489:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = > ???
E       lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/namesilo.py:38: AuthenticationError

_ NameSiloProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _

self = > ???
src/lexicon/_private/providers/namesilo.py:36:
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/namesilo.py:142: in _request
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

tests/providers/integration_tests.py:475:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E       lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/namesilo.py:38: AuthenticationError

_ NameSiloProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _

self = > ???
src/lexicon/_private/providers/namesilo.py:36:
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/namesilo.py:142: in _request
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

tests/providers/integration_tests.py:303:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E       lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/namesilo.py:38: AuthenticationError

_ NameSiloProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _

self = > ???
src/lexicon/_private/providers/namesilo.py:36:
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/namesilo.py:142: in _request
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

tests/providers/integration_tests.py:327:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E       lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/namesilo.py:38: AuthenticationError

_ NameSiloProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _

self = > ???
src/lexicon/_private/providers/namesilo.py:36:
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/namesilo.py:142: in _request
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

tests/providers/integration_tests.py:313:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E       lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/namesilo.py:38: AuthenticationError

_ NameSiloProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _

self = > ???
src/lexicon/_private/providers/namesilo.py:36:
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/namesilo.py:142: in _request
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

tests/providers/integration_tests.py:294:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E       lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/namesilo.py:38: AuthenticationError

_ NameSiloProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _

self = > ???
src/lexicon/_private/providers/namesilo.py:36:
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/namesilo.py:142: in _request
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

tests/providers/integration_tests.py:541:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E       lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/namesilo.py:38: AuthenticationError

_ NameSiloProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _

self = > ???
src/lexicon/_private/providers/namesilo.py:36:
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/namesilo.py:142: in _request
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

tests/providers/integration_tests.py:521:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E       lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/namesilo.py:38: AuthenticationError

_ NameSiloProviderTests.test_provider_when_calling_list_records_after_setting_ttl _

self = > ???
src/lexicon/_private/providers/namesilo.py:36:
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/namesilo.py:142: in _request
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? tests/providers/integration_tests.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = > ??? E lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string' src/lexicon/_private/providers/namesilo.py:38: AuthenticationError _ NameSiloProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? src/lexicon/_private/providers/namesilo.py:36: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namesilo.py:142: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/api/getDomainInfo?domain=capsulecdfake.com&version=1&type=xml&key=placeholder_auth_token' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. 
It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? tests/providers/integration_tests.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = > ??? E lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string' src/lexicon/_private/providers/namesilo.py:38: AuthenticationError _ NameSiloProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? src/lexicon/_private/providers/namesilo.py:36: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/namesilo.py:142: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/api/getDomainInfo?domain=capsulecdfake.com&version=1&type=xml&key=placeholder_auth_token' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
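The repeated failure above is a test-harness incompatibility rather than a NameSilo provider bug: during cassette playback vcrpy substitutes its own VCRHTTPResponse for urllib3's HTTPResponse, and that stub predates the version_string attribute which urllib3 (2.3 or later, by the look of connectionpool.py:551) reads while assembling its debug log line in _make_request(). The lookup happens while the arguments to log.debug() are being built, so it fails even though DEBUG logging is off, and the provider code at namesilo.py:38 re-raises the result as AuthenticationError. A minimal sketch of the mismatch, using a simplified stand-in class rather than vcrpy's real one:

    import logging

    log = logging.getLogger("urllib3.connectionpool")

    class FakePlaybackResponse:
        # Roughly what an older playback stub exposes: status, remaining length
        # and an http.client-style integer version, but no version_string.
        status = 200
        length_remaining = 0
        version = 11

    resp = FakePlaybackResponse()
    try:
        # urllib3 builds a log call like the one in _make_request(); the
        # arguments are evaluated before the logging level is consulted.
        log.debug('"%s %s" %s %s', "GET", "/api/getDomainInfo",
                  resp.version_string, resp.status)
    except AttributeError as exc:
        print(exc)  # 'FakePlaybackResponse' object has no attribute 'version_string'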
_ NameSiloProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _
src/lexicon/_private/providers/namesilo.py:36:
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/namesilo.py:142: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>           response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:
tests/providers/integration_tests.py:185:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E   lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/namesilo.py:38: AuthenticationError
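If patching the build-time test environment is acceptable, one stopgap is to hand vcrpy's playback response the attribute urllib3 expects before the suite starts. The snippet below is a hypothetical conftest-level patch, not something shipped by lexicon or vcrpy, and it assumes the stub still records the usual http.client-style integer in .version:

    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        _VERSIONS = {9: "HTTP/0.9", 10: "HTTP/1.0", 11: "HTTP/1.1"}
        # Expose the string form urllib3's logging wants, defaulting to
        # HTTP/1.1 when the recorded version is missing or unrecognised.
        VCRHTTPResponse.version_string = property(
            lambda self: _VERSIONS.get(getattr(self, "version", 11), "HTTP/1.1")
        )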
_ NameSiloProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _
src/lexicon/_private/providers/namesilo.py:36:
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/namesilo.py:142: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>           response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:
tests/providers/integration_tests.py:501:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E   lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/namesilo.py:38: AuthenticationError
_ NameSiloProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _
src/lexicon/_private/providers/namesilo.py:36:
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/namesilo.py:142: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>           response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:
tests/providers/integration_tests.py:173:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E   lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/namesilo.py:38: AuthenticationError
_ NameSiloProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _
src/lexicon/_private/providers/namesilo.py:36:
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/namesilo.py:142: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>           response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:
tests/providers/integration_tests.py:166:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E   lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/namesilo.py:38: AuthenticationError
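For the dns-lexicon package itself the takeaway is that only the recorded-cassette tests are affected: the chroot pairs a urllib3 that logs version_string with a python-vcrpy whose playback response predates it. Rebuilding once python-vcrpy picks up a release that defines the attribute (the 7.x series appears to), or temporarily holding urllib3 below 2.3 in the build root, should let the NameSilo tests pass again without touching the provider code.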
_ NameSiloProviderTests.test_provider_when_calling_update_record_should_modify_record _
src/lexicon/_private/providers/namesilo.py:36:
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/namesilo.py:142: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>           response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:
tests/providers/integration_tests.py:240:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E   lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/namesilo.py:38: AuthenticationError
_ NameSiloProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _
src/lexicon/_private/providers/namesilo.py:36:
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/namesilo.py:142: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
>           response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:
tests/providers/integration_tests.py:275:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E   lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/namesilo.py:38: AuthenticationError
_ NameSiloProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _
src/lexicon/_private/providers/namesilo.py:36:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/namesilo.py:142: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/api/getDomainInfo?domain=capsulecdfake.com&version=1&type=xml&key=placeholder_auth_token' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = > ??? E lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string' src/lexicon/_private/providers/namesilo.py:38: AuthenticationError ________________ NetcupProviderTests.test_provider_authenticate ________________ self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/netcup.py:47: in authenticate ??? src/lexicon/_private/providers/netcup.py:172: in _apicall ??? src/lexicon/_private/providers/netcup.py:189: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/run/webservice/servers/endpoint.php?JSON' body = '{"action": "login", "param": {"customernumber": "placeholder_auth_customer_id", "apikey": "placeholder_auth_api_key", "apipassword": "placeholder_auth_api_password"}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '166'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. 
If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NetcupProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/netcup.py:47: in authenticate ??? src/lexicon/_private/providers/netcup.py:172: in _apicall ??? src/lexicon/_private/providers/netcup.py:189: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/run/webservice/servers/endpoint.php?JSON' body = '{"action": "login", "param": {"customernumber": "placeholder_auth_customer_id", "apikey": "placeholder_auth_api_key", "apipassword": "placeholder_auth_api_password"}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '166'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NetcupProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:133: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/netcup.py:47: in authenticate ??? src/lexicon/_private/providers/netcup.py:172: in _apicall ??? src/lexicon/_private/providers/netcup.py:189: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/run/webservice/servers/endpoint.php?JSON' body = '{"action": "login", "param": {"customernumber": "placeholder_auth_customer_id", "apikey": "placeholder_auth_api_key", "apipassword": "placeholder_auth_api_password"}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '166'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NetcupProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = > ??? tests/providers/integration_tests.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/netcup.py:47: in authenticate ??? src/lexicon/_private/providers/netcup.py:172: in _apicall ??? src/lexicon/_private/providers/netcup.py:189: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/run/webservice/servers/endpoint.php?JSON' body = '{"action": "login", "param": {"customernumber": "placeholder_auth_customer_id", "apikey": "placeholder_auth_api_key", "apipassword": "placeholder_auth_api_password"}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '166'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NetcupProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = > ??? tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/netcup.py:47: in authenticate ??? src/lexicon/_private/providers/netcup.py:172: in _apicall ??? src/lexicon/_private/providers/netcup.py:189: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/run/webservice/servers/endpoint.php?JSON' body = '{"action": "login", "param": {"customernumber": "placeholder_auth_customer_id", "apikey": "placeholder_auth_api_key", "apipassword": "placeholder_auth_api_password"}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '166'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NetcupProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/netcup.py:47: in authenticate ??? src/lexicon/_private/providers/netcup.py:172: in _apicall ??? src/lexicon/_private/providers/netcup.py:189: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
method = 'POST', url = '/run/webservice/servers/endpoint.php?JSON'
body = '{"action": "login", "param": {"customernumber": "placeholder_auth_customer_id", "apikey": "placeholder_auth_api_key", "apipassword": "placeholder_auth_api_password"}}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '166'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
preload_content = False, decode_content = False, enforce_content_length = True
>           response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ NetcupProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _
tests/providers/integration_tests.py:489: ???
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/netcup.py:47: in authenticate
src/lexicon/_private/providers/netcup.py:172: in _apicall
src/lexicon/_private/providers/netcup.py:189: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
>   response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
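Every failure in this run is the same mismatch: urllib3's connection pool logs response.version_string right after getresponse(), but the canned response object that vcrpy substitutes during cassette playback predates that attribute. A minimal sketch of the mismatch with a stand-in class (the attribute name comes straight from the traceback; the stand-in itself is illustrative, not vcrpy's real stub):

    # Stand-in for vcrpy's playback response: it carries http.client's integer
    # `version` field but was written before urllib3 started reading
    # `version_string`, so that attribute simply does not exist.
    class FakeVCRHTTPResponse:
        version = 11  # HTTP/1.1 as an integer, http.client style

    resp = FakeVCRHTTPResponse()
    print(resp.version)          # 11
    print(resp.version_string)   # AttributeError, exactly as in the log above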
_ NetcupProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _
tests/providers/integration_tests.py:475: ???
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/netcup.py:47: in authenticate
src/lexicon/_private/providers/netcup.py:172: in _apicall
src/lexicon/_private/providers/netcup.py:189: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
>   response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
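authenticate() in the netcup provider is the first thing every one of these tests exercises, and the request body captured above is its login call. A rough reconstruction of that call, assuming the production endpoint sits on ccp.netcup.net (the log only shows the path and the JSON payload):

    import requests

    # Hypothetical host; only '/run/webservice/servers/endpoint.php?JSON' appears in the log.
    NETCUP_ENDPOINT = "https://ccp.netcup.net/run/webservice/servers/endpoint.php?JSON"

    payload = {
        "action": "login",
        "param": {
            "customernumber": "placeholder_auth_customer_id",
            "apikey": "placeholder_auth_api_key",
            "apipassword": "placeholder_auth_api_password",
        },
    }

    # During the test run this never reaches the network: vcrpy replays a recorded
    # cassette and hands urllib3 a VCRHTTPResponse object instead.
    response = requests.post(NETCUP_ENDPOINT, json=payload, headers={"Accept": "application/json"})
    print(response.json())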
_ NetcupProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _
tests/providers/integration_tests.py:303: ???
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/netcup.py:47: in authenticate
src/lexicon/_private/providers/netcup.py:172: in _apicall
src/lexicon/_private/providers/netcup.py:189: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
>   response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ NetcupProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _
tests/providers/integration_tests.py:327: ???
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/netcup.py:47: in authenticate
src/lexicon/_private/providers/netcup.py:172: in _apicall
src/lexicon/_private/providers/netcup.py:189: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
>   response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ NetcupProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _
tests/providers/integration_tests.py:313: ???
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/netcup.py:47: in authenticate
src/lexicon/_private/providers/netcup.py:172: in _apicall
src/lexicon/_private/providers/netcup.py:189: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
>   response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ NetcupProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _
tests/providers/integration_tests.py:294: ???
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/netcup.py:47: in authenticate
src/lexicon/_private/providers/netcup.py:172: in _apicall
src/lexicon/_private/providers/netcup.py:189: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
>   response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ NetcupProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _
tests/providers/integration_tests.py:541: ???
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/netcup.py:47: in authenticate
src/lexicon/_private/providers/netcup.py:172: in _apicall
src/lexicon/_private/providers/netcup.py:189: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
>   response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ NetcupProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _
tests/providers/integration_tests.py:521: ???
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/netcup.py:47: in authenticate
src/lexicon/_private/providers/netcup.py:172: in _apicall
src/lexicon/_private/providers/netcup.py:189: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
>   response.version_string,
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
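Until the test stack catches up with the urllib3 shipped here, the practical options for a rebuild are to pin urllib3 to an older release that does not log version_string, deselect the provider integration tests, or give vcrpy's stub the missing attribute from a conftest. A sketch of the last option, assuming the stub is still exposed as vcr.stubs.VCRHTTPResponse (verify against the installed vcrpy before relying on it):

    # conftest.py sketch: a local workaround, not the upstream fix.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # Newer urllib3 logs response.version_string; map the stub's integer
        # `version` (10/11, http.client style) onto the expected string form.
        VCRHTTPResponse.version_string = property(
            lambda self: "HTTP/1.1" if getattr(self, "version", 11) == 11 else "HTTP/1.0"
        )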
_ NetcupProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _

tests/providers/integration_tests.py:507:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/netcup.py:47: in authenticate
    ???
src/lexicon/_private/providers/netcup.py:172: in _apicall
    ???
src/lexicon/_private/providers/netcup.py:189: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ NetcupProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _

tests/providers/integration_tests.py:199:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/netcup.py:47: in authenticate
    ???
src/lexicon/_private/providers/netcup.py:172: in _apicall
    ???
src/lexicon/_private/providers/netcup.py:189: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ NetcupProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _

tests/providers/integration_tests.py:185:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/netcup.py:47: in authenticate
    ???
src/lexicon/_private/providers/netcup.py:172: in _apicall
    ???
src/lexicon/_private/providers/netcup.py:189: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ NetcupProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _

tests/providers/integration_tests.py:501:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/netcup.py:47: in authenticate
    ???
src/lexicon/_private/providers/netcup.py:172: in _apicall
    ???
src/lexicon/_private/providers/netcup.py:189: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ NetcupProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _

tests/providers/integration_tests.py:173:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/netcup.py:47: in authenticate
    ???
src/lexicon/_private/providers/netcup.py:172: in _apicall
    ???
src/lexicon/_private/providers/netcup.py:189: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ NetcupProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _

tests/providers/integration_tests.py:166:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/netcup.py:47: in authenticate
    ???
src/lexicon/_private/providers/netcup.py:172: in _apicall
    ???
src/lexicon/_private/providers/netcup.py:189: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ NetcupProviderTests.test_provider_when_calling_update_record_should_modify_record _

tests/providers/integration_tests.py:240:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/netcup.py:47: in authenticate
    ???
src/lexicon/_private/providers/netcup.py:172: in _apicall
    ???
src/lexicon/_private/providers/netcup.py:189: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ NetcupProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/netcup.py:47: in authenticate ??? src/lexicon/_private/providers/netcup.py:172: in _apicall ??? src/lexicon/_private/providers/netcup.py:189: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/run/webservice/servers/endpoint.php?JSON' body = '{"action": "login", "param": {"customernumber": "placeholder_auth_customer_id", "apikey": "placeholder_auth_api_key", "apipassword": "placeholder_auth_api_password"}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '166'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
_ NetcupProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _
self = > ???
tests/providers/integration_tests.py:275:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/netcup.py:47: in authenticate ???
src/lexicon/_private/providers/netcup.py:172: in _apicall ???
src/lexicon/_private/providers/netcup.py:189: in _request ???
>           response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
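Every one of these failures bottoms out in the same place: the urllib3 in the chroot logs response.version_string from HTTPConnectionPool._make_request (connectionpool.py:551), but during cassette playback the object returned by conn.getresponse() is vcrpy's VCRHTTPResponse stub, which does not define that attribute. A quick way to confirm the mismatch from inside the build environment is the two-line check below; the import paths are assumptions about the chroot's urllib3/vcrpy layout, not part of this build:

    # Diagnostic sketch: confirm which side is missing the attribute.
    # urllib3's own response class carries version_string; vcrpy's playback stub does not.
    from urllib3.response import HTTPResponse
    from vcr.stubs import VCRHTTPResponse

    print(hasattr(HTTPResponse, "version_string"))     # expected True with the urllib3 in this chroot
    print(hasattr(VCRHTTPResponse, "version_string"))  # False here, hence the AttributeError above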
_ NetcupProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _
self = > ???
tests/providers/integration_tests.py:259:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/netcup.py:47: in authenticate ???
src/lexicon/_private/providers/netcup.py:172: in _apicall ???
src/lexicon/_private/providers/netcup.py:189: in _request ???
>           response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_________________ NFSNProviderTests.test_provider_authenticate _________________
self = > ???
src/lexicon/_private/providers/nfsn.py:44:
src/lexicon/interfaces.py:171: in _post
    return self._request("POST", url, data=data, query_params=query_params)
src/lexicon/_private/providers/nfsn.py:178: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
self = conn = method = 'POST', url = '/dns/koupia.xyz/listRRs', body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...placeholder_auth_username;1736104093;mCZRKCaeuNzJhROk;438b7f6d565baef1880eb12bbca8d77dc0363305', 'Content-Length': '0'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn = preload_content = False, decode_content = False, enforce_content_length = True
>           response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
During handling of the above exception, another exception occurred:
self = > ???
tests/providers/integration_tests.py:106:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
self = > ???
E           lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/nfsn.py:46: AuthenticationError
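The NFSN failures add a second layer: the provider's authenticate() catches the underlying error and re-raises it as lexicon.exceptions.AuthenticationError, which is why each AttributeError is followed by a chained AuthenticationError carrying the same message. Until a vcrpy that knows about version_string is packaged, one conceivable stopgap is to backfill the attribute from a test-suite conftest.py; this is a hypothetical local shim, not something the lexicon test suite or this PKGBUILD actually ships:

    # conftest.py (hypothetical): backfill version_string on vcrpy's playback stub.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        def _version_string(self):
            # Map http.client-style numeric versions to the string urllib3 logs;
            # fall back to HTTP/1.1 when the recorded cassette carries no version.
            return {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(getattr(self, "version", 11), "HTTP/1.1")

        VCRHTTPResponse.version_string = property(_version_string)

The cleaner fix, of course, is simply to build against a urllib3/vcrpy pair that agree with each other.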
_ NFSNProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
self = > ???
src/lexicon/_private/providers/nfsn.py:44:
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...placeholder_auth_username;1736104093;6YzMhItuajA5zIH9;99f0df255e85e7fa5d7ee53e70d35d8acaf0a381', 'Content-Length': '0'}
>           response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
During handling of the above exception, another exception occurred:
self = > ???
tests/providers/integration_tests.py:126:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
self = > ???
E           lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/nfsn.py:46: AuthenticationError
_ NFSNProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
self = > ???
src/lexicon/_private/providers/nfsn.py:44:
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...placeholder_auth_username;1736104094;R6hja5vbDLip6Hq1;f5f51dd8b4b9cdad24511ebe3e4d794de1b90542', 'Content-Length': '0'}
>           response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
During handling of the above exception, another exception occurred:
self = > ???
tests/providers/integration_tests.py:133:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
self = > ???
E           lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/nfsn.py:46: AuthenticationError
_ NFSNProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
self = > ???
src/lexicon/_private/providers/nfsn.py:44:
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...placeholder_auth_username;1736104094;S4anXI7vzWtMg5FE;45e483a52e7e353fbb11477d79e50ba2a75c3371', 'Content-Length': '0'}
>           response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
During handling of the above exception, another exception occurred:
self = > ???
tests/providers/integration_tests.py:156:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
self = > ???
E           lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/nfsn.py:46: AuthenticationError
_ NFSNProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _
self = > ???
src/lexicon/_private/providers/nfsn.py:44:
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...placeholder_auth_username;1736104094;vsg1zHgwjEXTNElm;6567401bc1b4914b3661f3c77f75e01541be38d5', 'Content-Length': '0'}
>           response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
During handling of the above exception, another exception occurred:
self = > ???
tests/providers/integration_tests.py:147:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
self = > ???
E           lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/nfsn.py:46: AuthenticationError
_ NFSNProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _
self = > ???
src/lexicon/_private/providers/nfsn.py:44:
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/dns/koupia.xyz/listRRs', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...placeholder_auth_username;1736104094;baeSPRRiJA5IB3Fq;dcd4cff24dc369f2ce1af48790c685e6e43d576b', 'Content-Length': '0'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? 
tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = > ??? E lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string' src/lexicon/_private/providers/nfsn.py:46: AuthenticationError _ NFSNProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _ self = > ??? src/lexicon/_private/providers/nfsn.py:44: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/nfsn.py:178: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/dns/koupia.xyz/listRRs', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...placeholder_auth_username;1736104094;oehc99PmfnUTkdPF;7f535a09819ab798c59d50f5937cef5cf40ec650', 'Content-Length': '0'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. 
If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? tests/providers/integration_tests.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = > ??? E lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string' src/lexicon/_private/providers/nfsn.py:46: AuthenticationError _ NFSNProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ self = > ??? src/lexicon/_private/providers/nfsn.py:44: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/nfsn.py:178: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/dns/koupia.xyz/listRRs', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...placeholder_auth_username;1736104095;DSE4xXacFgUN35iK;8a062a94220f22248ec800ab0bb156ad42a58feb', 'Content-Length': '0'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) 
:param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = > ??? E lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string' src/lexicon/_private/providers/nfsn.py:46: AuthenticationError _ NFSNProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? src/lexicon/_private/providers/nfsn.py:44: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/nfsn.py:178: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/dns/koupia.xyz/listRRs', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...placeholder_auth_username;1736104095;JVCnDlX4NcgoQ0mT;a3951f9e5f02388730c12d4daa0990a5307754e0', 'Content-Length': '0'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? 
tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = > ??? E lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string' src/lexicon/_private/providers/nfsn.py:46: AuthenticationError _ NFSNProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? src/lexicon/_private/providers/nfsn.py:44: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/nfsn.py:178: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/dns/koupia.xyz/listRRs', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...placeholder_auth_username;1736104095;spqlHUbrbpGp4r3c;71e806c141f4f185d7270bd2b28b914b1b40656c', 'Content-Length': '0'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. 
If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = > ??? E lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string' src/lexicon/_private/providers/nfsn.py:46: AuthenticationError _ NFSNProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? src/lexicon/_private/providers/nfsn.py:44: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/nfsn.py:178: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/dns/koupia.xyz/listRRs', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...placeholder_auth_username;1736104095;3XOx2jJJRDlUkvVs;197f89711d618a25193b2d787e24e40b8748c9af', 'Content-Length': '0'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) 
:param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = > ??? E lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string' src/lexicon/_private/providers/nfsn.py:46: AuthenticationError _ NFSNProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? src/lexicon/_private/providers/nfsn.py:44: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/nfsn.py:178: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/dns/koupia.xyz/listRRs', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...placeholder_auth_username;1736104096;2khfZUgk70JpcBet;0f8588098b0b69531d15ece98ef9fa149dfac39e', 'Content-Length': '0'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? 
tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = > ??? E lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string' src/lexicon/_private/providers/nfsn.py:46: AuthenticationError _ NFSNProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? src/lexicon/_private/providers/nfsn.py:44: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/nfsn.py:178: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/dns/koupia.xyz/listRRs', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...placeholder_auth_username;1736104096;oD2TuDO3ZNoBzc4x;575aaa0c2b1588ae84cbe401863761d35ded92af', 'Content-Length': '0'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. 
_ NFSNProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _

self = > ???

src/lexicon/_private/providers/nfsn.py:44:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
src/lexicon/interfaces.py:171: in _post
    return self._request("POST", url, data=data, query_params=query_params)
src/lexicon/_private/providers/nfsn.py:178: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

method = 'POST', url = '/dns/koupia.xyz/listRRs', body = None
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
preload_content = False, decode_content = False, enforce_content_length = True

E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

tests/providers/integration_tests.py:541:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
E   lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'

src/lexicon/_private/providers/nfsn.py:46: AuthenticationError

_ NFSNProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _

src/lexicon/_private/providers/nfsn.py:44:
src/lexicon/_private/providers/nfsn.py:178: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

tests/providers/integration_tests.py:521:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E   lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/nfsn.py:46: AuthenticationError

_ NFSNProviderTests.test_provider_when_calling_list_records_after_setting_ttl _

src/lexicon/_private/providers/nfsn.py:44:
src/lexicon/_private/providers/nfsn.py:178: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

tests/providers/integration_tests.py:211:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E   lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/nfsn.py:46: AuthenticationError
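For reference, the three values that the failing log.debug call reads off the response are ordinary attributes on urllib3's own response class, so the same access works against a live server; only the VCR-replayed responses are missing version_string. A quick sketch, assuming urllib3 >= 2.3 as installed in this chroot and using example.org purely as a stand-in host, not one touched by these tests:

import urllib3

# version_string is the attribute the VCR stub is missing here; on a real
# urllib3 response all three are available to the connection-pool logger.
resp = urllib3.request("GET", "https://example.org")
print(resp.version_string, resp.status, resp.length_remaining)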
_ NFSNProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _

src/lexicon/_private/providers/nfsn.py:44:
src/lexicon/_private/providers/nfsn.py:178: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

tests/providers/integration_tests.py:507:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E   lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/nfsn.py:46: AuthenticationError

_ NFSNProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _

src/lexicon/_private/providers/nfsn.py:44:
src/lexicon/_private/providers/nfsn.py:178: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

tests/providers/integration_tests.py:199:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E   lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/nfsn.py:46: AuthenticationError

_ NFSNProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _

src/lexicon/_private/providers/nfsn.py:44:
src/lexicon/_private/providers/nfsn.py:178: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

tests/providers/integration_tests.py:185:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E   lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/nfsn.py:46: AuthenticationError
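The captured locals repeated in each of these tracebacks also show the retry and timeout configuration that requests hands to urllib3 for the recorded calls. Rebuilt with urllib3's public API, purely as a sketch mirroring the reprs above and nothing lexicon-specific:

from urllib3.util import Retry, Timeout

# The same configuration shown in the captured locals: a single attempt with
# no retries and no explicit timeouts.
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None)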
_ NFSNProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _

src/lexicon/_private/providers/nfsn.py:44:
src/lexicon/_private/providers/nfsn.py:178: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

tests/providers/integration_tests.py:501:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E   lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/nfsn.py:46: AuthenticationError
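The reason record-listing and record-deletion tests all report an authentication problem is visible in the frames above: the provider's setup POST to /dns/<domain>/listRRs (nfsn.py:44) is the first request every test makes, and whatever it raises is re-wrapped at nfsn.py:46. The sketch below is illustrative only, a guess at that wrapping pattern rather than the actual provider source; the function name and parameters are invented for the example:

from lexicon.exceptions import AuthenticationError

def authenticate(post, domain):
    # post: any callable issuing the provider's POST; domain: the zone under test.
    # On failure the original message is carried into AuthenticationError, which is
    # how the version_string AttributeError ends up labelled as an auth error.
    try:
        return post(f"/dns/{domain}/listRRs")
    except Exception as exc:
        raise AuthenticationError(str(exc))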
_ NFSNProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _

src/lexicon/_private/providers/nfsn.py:44:
src/lexicon/_private/providers/nfsn.py:178: in _request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

tests/providers/integration_tests.py:173:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E   lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/nfsn.py:46: AuthenticationError

_ NFSNProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _

self = > ???

src/lexicon/_private/providers/nfsn.py:44:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
src/lexicon/interfaces.py:171: in _post
    return self._request("POST", url, data=data, query_params=query_params)
src/lexicon/_private/providers/nfsn.py:178: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/dns/koupia.xyz/listRRs', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...placeholder_auth_username;1736104098;NOocXEiTYrk1QqtG;c89bd65a2398a7bc4e9810f2796308d9e0ba718f', 'Content-Length': '0'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? 
tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = > ??? E lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string' src/lexicon/_private/providers/nfsn.py:46: AuthenticationError _ NFSNProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? src/lexicon/_private/providers/nfsn.py:44: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/nfsn.py:178: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/dns/koupia.xyz/listRRs', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...placeholder_auth_username;1736104098;owVSFm2pVBgITOCr;dc86f1c59ed9d6aa62bf8ca1900345efa6314595', 'Content-Length': '0'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. 
Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = > ??? E lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string' src/lexicon/_private/providers/nfsn.py:46: AuthenticationError _ NFSNProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? src/lexicon/_private/providers/nfsn.py:44: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/nfsn.py:178: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/dns/koupia.xyz/listRRs', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...placeholder_auth_username;1736104098;A87jHMKWar0rq9kj;ab2a5dcfbc7ac1c83de0764b77180c7e38c6e824', 'Content-Length': '0'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) 
:param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = > ??? E lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string' src/lexicon/_private/providers/nfsn.py:46: AuthenticationError _ NFSNProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? src/lexicon/_private/providers/nfsn.py:44: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/nfsn.py:178: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/dns/koupia.xyz/listRRs', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...placeholder_auth_username;1736104099;WjPZLMME0STUKyDm;443691ee1bce02f728280b8a81fc42504c4ddfc6', 'Content-Length': '0'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? 
tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = > ??? E lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string' src/lexicon/_private/providers/nfsn.py:46: AuthenticationError _ NFSNProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? src/lexicon/_private/providers/nfsn.py:44: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/nfsn.py:178: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/dns/koupia.xyz/listRRs', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...placeholder_auth_username;1736104099;bMYDFrjxBYhmHqqr;af0d3ca4791fa9a3a6908ab46beefa465570cecd', 'Content-Length': '0'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. 
If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = > ??? E lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string' src/lexicon/_private/providers/nfsn.py:46: AuthenticationError ________________ NjallaProviderTests.test_provider_authenticate ________________ self = > ??? src/lexicon/_private/providers/njalla.py:34: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/_private/providers/njalla.py:124: in _api_call ??? src/lexicon/_private/providers/njalla.py:150: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/1/' body = b'{"method": "get-domain", "params": {"domain": "example.com"}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...p-alive', 'Authorization': 'Njalla placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '61'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = > ??? E lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string' src/lexicon/_private/providers/njalla.py:36: AuthenticationError _ NjallaProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? src/lexicon/_private/providers/njalla.py:34: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/_private/providers/njalla.py:124: in _api_call ??? src/lexicon/_private/providers/njalla.py:150: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/1/' body = b'{"method": "get-domain", "params": {"domain": "example.com"}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...p-alive', 'Authorization': 'Njalla placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '61'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? 
tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = > ??? E lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string' src/lexicon/_private/providers/njalla.py:36: AuthenticationError _ NjallaProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _ self = > ??? src/lexicon/_private/providers/njalla.py:34: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/_private/providers/njalla.py:124: in _api_call ??? src/lexicon/_private/providers/njalla.py:150: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/1/' body = b'{"method": "get-domain", "params": {"domain": "example.com"}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...p-alive', 'Authorization': 'Njalla placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '61'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. 
If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? tests/providers/integration_tests.py:133: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = > ??? E lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string' src/lexicon/_private/providers/njalla.py:36: AuthenticationError _ NjallaProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = > ??? src/lexicon/_private/providers/njalla.py:34: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/_private/providers/njalla.py:124: in _api_call ??? src/lexicon/_private/providers/njalla.py:150: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/1/' body = b'{"method": "get-domain", "params": {"domain": "example.com"}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...p-alive', 'Authorization': 'Njalla placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '61'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) 
:param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? tests/providers/integration_tests.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = > ??? E lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string' src/lexicon/_private/providers/njalla.py:36: AuthenticationError _ NjallaProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = > ??? src/lexicon/_private/providers/njalla.py:34: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/_private/providers/njalla.py:124: in _api_call ??? src/lexicon/_private/providers/njalla.py:150: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/1/' body = b'{"method": "get-domain", "params": {"domain": "example.com"}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...p-alive', 'Authorization': 'Njalla placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '61'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? 
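Note that the AttributeError fires even though it originates in a log.debug() call: the interpolation arguments are evaluated eagerly, before any log-level filtering happens, so silencing urllib3's logger would not prevent the crash. A small self-contained illustration of that behaviour (FakeResponse is a made-up stand-in, not part of urllib3 or vcrpy):

    import logging

    class FakeResponse:
        # Stand-in for VCRHTTPResponse: it has a status but no version_string.
        status = 200

    log = logging.getLogger("urllib3.connectionpool")
    log.setLevel(logging.CRITICAL)  # debug records would be discarded anyway...

    resp = FakeResponse()
    # ...but the argument tuple is built before logging can filter the call,
    # so the attribute lookup raises AttributeError just like in the traceback.
    log.debug("status=%s version=%s", resp.status, resp.version_string)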
tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = > ??? E lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string' src/lexicon/_private/providers/njalla.py:36: AuthenticationError _ NjallaProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? src/lexicon/_private/providers/njalla.py:34: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/_private/providers/njalla.py:124: in _api_call ??? src/lexicon/_private/providers/njalla.py:150: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/1/' body = b'{"method": "get-domain", "params": {"domain": "example.com"}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...p-alive', 'Authorization': 'Njalla placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '61'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. 
If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = > ??? E lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string' src/lexicon/_private/providers/njalla.py:36: AuthenticationError _ NjallaProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _ self = > ??? src/lexicon/_private/providers/njalla.py:34: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/_private/providers/njalla.py:124: in _api_call ??? src/lexicon/_private/providers/njalla.py:150: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/1/' body = b'{"method": "get-domain", "params": {"domain": "example.com"}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...p-alive', 'Authorization': 'Njalla placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '61'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) 
:param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? tests/providers/integration_tests.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = > ??? E lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string' src/lexicon/_private/providers/njalla.py:36: AuthenticationError _ NjallaProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? src/lexicon/_private/providers/njalla.py:34: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/_private/providers/njalla.py:124: in _api_call ??? src/lexicon/_private/providers/njalla.py:150: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/1/' body = b'{"method": "get-domain", "params": {"domain": "example.com"}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...p-alive', 'Authorization': 'Njalla placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '61'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? 
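The second half of each failure ("During handling of the above exception, another exception occurred") shows lexicon converting the transport error into its own exception: integration_tests.py:418 calls _construct_authenticated_provider, the njalla provider's authentication probe fails on the AttributeError, and njalla.py:36 re-raises it as lexicon.exceptions.AuthenticationError with the original message, which is why the same 'version_string' text appears twice per test. A hedged sketch of that wrapping pattern (the function name and the broad except are illustrative, not the provider's actual code):

    from lexicon.exceptions import AuthenticationError  # the exception type shown in the log

    def construct_authenticated_provider(provider):
        # Illustrative only: any failure while probing the account/domain is
        # re-raised as AuthenticationError carrying the original message.
        try:
            provider.authenticate()
        except Exception as exc:
            # Raising without "from exc" reproduces the implicit "During handling
            # of the above exception" chaining seen in the traceback.
            raise AuthenticationError(str(exc))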
tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = > ??? E lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string' src/lexicon/_private/providers/njalla.py:36: AuthenticationError _ NjallaProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? src/lexicon/_private/providers/njalla.py:34: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ src/lexicon/_private/providers/njalla.py:124: in _api_call ??? src/lexicon/_private/providers/njalla.py:150: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/1/' body = b'{"method": "get-domain", "params": {"domain": "example.com"}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...p-alive', 'Authorization': 'Njalla placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '61'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. 
If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})"
                )
            conn.timeout = read_timeout

            # Receive the response from the server
            try:
                response = conn.getresponse()
            except (BaseSSLError, OSError) as e:
                self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
                raise

        # Set properties that are used by the pooling layer.
        response.retries = retries
        response._connection = response_conn  # type: ignore[attr-defined]
        response._pool = self  # type: ignore[attr-defined]

        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

self = > ???

tests/providers/integration_tests.py:327:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = > ???
E       lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'

src/lexicon/_private/providers/njalla.py:36: AuthenticationError

_ NjallaProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _

self = > ???
src/lexicon/_private/providers/njalla.py:34:
[... identical traceback through urllib3's _make_request as above ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

self = > ???
tests/providers/integration_tests.py:313:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E       lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/njalla.py:36: AuthenticationError
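Every NjallaProviderTests failure in this run is the same failure: the urllib3 in the build root logs response.version_string after each request (the attribute access at connectionpool.py:551 above), but the VCRHTTPResponse stub that python-vcrpy substitutes while replaying cassettes does not define that attribute, so every replayed request dies inside urllib3's debug logging before the provider ever sees a response. A quick check from inside the chroot makes the mismatch visible; the module path is vcrpy's, the rest is an assumption about what is installed there.

    # Confirms the diagnosis: newer urllib3 wants version_string, vcrpy's stub lacks it.
    import urllib3
    from vcr.stubs import VCRHTTPResponse

    print(urllib3.__version__)
    print(hasattr(VCRHTTPResponse, "version_string"))  # False here, hence the AttributeError
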
_ NjallaProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _

self = > ???
src/lexicon/_private/providers/njalla.py:34:
[... identical traceback through urllib3's _make_request as above ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

self = > ???
tests/providers/integration_tests.py:294:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E       lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/njalla.py:36: AuthenticationError

_ NjallaProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _

self = > ???
src/lexicon/_private/providers/njalla.py:34:
[... identical traceback through urllib3's _make_request as above ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

self = > ???
tests/providers/integration_tests.py:541:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E       lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/njalla.py:36: AuthenticationError

_ NjallaProviderTests.test_provider_when_calling_list_records_after_setting_ttl _

self = > ???
src/lexicon/_private/providers/njalla.py:34:
[... identical traceback through urllib3's _make_request as above ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

self = > ???
tests/providers/integration_tests.py:211:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E       lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/njalla.py:36: AuthenticationError

_ NjallaProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _

self = > ???
src/lexicon/_private/providers/njalla.py:34:
[... identical traceback through urllib3's _make_request as above ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

self = > ???
tests/providers/integration_tests.py:507:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E       lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/njalla.py:36: AuthenticationError
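A build-local workaround, sketched below for the test suite's conftest.py, would be to give the stub the attribute urllib3 expects. This assumes the python-vcrpy in the chroot simply predates the upstream fix, and it is a stopgap rather than the proper fix (updating python-vcrpy).

    # Hypothetical compatibility shim, not vcrpy's own fix: expose version_string
    # on the playback stub so urllib3's log.debug() call can format it.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):

        @property
        def version_string(self):
            # urllib3 renders "HTTP/1.0" / "HTTP/1.1" from the integer version
            # attribute; fall back to HTTP/1.1 if the cassette recorded none.
            return {9: "HTTP/0.9", 10: "HTTP/1.0", 11: "HTTP/1.1"}.get(
                getattr(self, "version", 11), "HTTP/1.1"
            )

        VCRHTTPResponse.version_string = version_string
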
_ NjallaProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _

self = > ???
src/lexicon/_private/providers/njalla.py:34:
[... identical traceback through urllib3's _make_request as above ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

self = > ???
tests/providers/integration_tests.py:199:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E       lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/njalla.py:36: AuthenticationError

_ NjallaProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _

self = > ???
src/lexicon/_private/providers/njalla.py:34:
[... identical traceback through urllib3's _make_request as above ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

self = > ???
tests/providers/integration_tests.py:185:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E       lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/njalla.py:36: AuthenticationError

_ NjallaProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _

self = > ???
src/lexicon/_private/providers/njalla.py:34:
[... identical traceback through urllib3's _make_request as above ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

self = > ???
tests/providers/integration_tests.py:501:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E       lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/njalla.py:36: AuthenticationError
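The lexicon.exceptions.AuthenticationError that each test ultimately reports is secondary: the njalla provider authenticates by issuing one get-domain call and re-raises any failure from it as an authentication problem, which is how a transport-level AttributeError ends up under that name. A rough sketch of the chain behind njalla.py:34/124/150 follows; the function names, signatures and endpoint are assumptions for illustration, not lexicon's actual code.

    import requests
    from lexicon.exceptions import AuthenticationError

    API_ENDPOINT = "https://njal.la/api/1/"  # assumed host; the log only shows the path /api/1/

    def _request(token: str, method: str, params: dict) -> dict:
        # njalla.py:150 - a plain requests call; under vcrpy this goes to the cassette stub
        response = requests.post(
            API_ENDPOINT,
            json={"method": method, "params": params},
            headers={"Authorization": f"Njalla {token}", "Content-Type": "application/json"},
        )
        response.raise_for_status()
        return response.json()

    def authenticate(token: str, domain: str) -> None:
        # njalla.py:34-36 - any exception raised by the API call, including the
        # AttributeError from deep inside urllib3, is reported as an auth failure.
        try:
            _request(token, "get-domain", {"domain": domain})
        except Exception as error:
            raise AuthenticationError(str(error)) from error
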
tests/providers/integration_tests.py:501:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E   lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/njalla.py:36: AuthenticationError
_ NjallaProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _
src/lexicon/_private/providers/njalla.py:34:
src/lexicon/_private/providers/njalla.py:124: in _api_call
src/lexicon/_private/providers/njalla.py:150: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
method = 'POST', url = '/api/1/', body = b'{"method": "get-domain", "params": {"domain": "example.com"}}'
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
During handling of the above exception, another exception occurred:
tests/providers/integration_tests.py:173:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E   lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/njalla.py:36: AuthenticationError
_ NjallaProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _
src/lexicon/_private/providers/njalla.py:34:
src/lexicon/_private/providers/njalla.py:124: in _api_call
src/lexicon/_private/providers/njalla.py:150: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
During handling of the above exception, another exception occurred:
tests/providers/integration_tests.py:166:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E   lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/njalla.py:36: AuthenticationError
_ NjallaProviderTests.test_provider_when_calling_update_record_should_modify_record _
src/lexicon/_private/providers/njalla.py:34:
src/lexicon/_private/providers/njalla.py:124: in _api_call
src/lexicon/_private/providers/njalla.py:150: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
During handling of the above exception, another exception occurred:
tests/providers/integration_tests.py:240:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E   lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/njalla.py:36: AuthenticationError
_ NjallaProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _
src/lexicon/_private/providers/njalla.py:34:
src/lexicon/_private/providers/njalla.py:124: in _api_call
src/lexicon/_private/providers/njalla.py:150: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
During handling of the above exception, another exception occurred:
tests/providers/integration_tests.py:251:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E   lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/njalla.py:36: AuthenticationError
_ NjallaProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _
src/lexicon/_private/providers/njalla.py:34:
src/lexicon/_private/providers/njalla.py:124: in _api_call
src/lexicon/_private/providers/njalla.py:150: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
During handling of the above exception, another exception occurred:
tests/providers/integration_tests.py:275:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E   lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/njalla.py:36: AuthenticationError
_ NjallaProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _
src/lexicon/_private/providers/njalla.py:34:
src/lexicon/_private/providers/njalla.py:124: in _api_call
src/lexicon/_private/providers/njalla.py:150: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
During handling of the above exception, another exception occurred:
tests/providers/integration_tests.py:259:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
E   lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string'
src/lexicon/_private/providers/njalla.py:36: AuthenticationError
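These failures all bottom out in the same place: the connection-pool logging call reads response.version_string, an attribute that recent urllib3 releases expect every response object to carry, but the VCRHTTPResponse stub that vcrpy substitutes when replaying recorded cassettes does not provide it, so the providers never get past their authenticate step. A small compatibility shim loaded at test time would be one way to unblock the recorded tests until the stub catches up. The sketch below is illustrative only and is not part of dns-lexicon, vcrpy, or urllib3; it assumes the stub class is importable as vcr.stubs.VCRHTTPResponse and that mirroring http.client's numeric version attribute as a string is acceptable for logging purposes.

    # conftest.py -- illustrative compatibility shim (assumption: the installed vcrpy
    # predates the version_string attribute that urllib3's connection pool now logs)
    from vcr.stubs import VCRHTTPResponse  # assumed import path for the replay stub

    if not hasattr(VCRHTTPResponse, "version_string"):
        def _version_string(self):
            # http.client exposes the protocol as 10 or 11; map it back to a string.
            return "HTTP/1.0" if getattr(self, "version", 11) == 10 else "HTTP/1.1"

        # Expose it as a read-only property so it looks like a real urllib3 response.
        VCRHTTPResponse.version_string = property(_version_string)

Upgrading to a vcrpy release that already defines version_string, or holding urllib3 back to a version that does not log it, would have the same effect without patching anything during the build.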
tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = > ??? E lexicon.exceptions.AuthenticationError: 'VCRHTTPResponse' object has no attribute 'version_string' src/lexicon/_private/providers/njalla.py:36: AuthenticationError _________________ Ns1ProviderTests.test_provider_authenticate __________________ self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/nsone.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/nsone.py:253: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/zones/lexicon-example.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'X-NSONE-Key': 'placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ Ns1ProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/nsone.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/nsone.py:253: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/zones/lexicon-example.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'X-NSONE-Key': 'placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
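Every Ns1ProviderTests failure in this run stops at the same point: urllib3 2.3 and later log response.version_string straight after getresponse(), while the VCRHTTPResponse object that vcrpy substitutes during cassette playback predates that attribute, so the lookup raises before the provider code ever sees a reply. The snippet below is a minimal sketch of one possible local shim, not part of dns-lexicon or vcrpy; it assumes the stub is importable as vcr.stubs.VCRHTTPResponse and that a fixed protocol string is acceptable, since the value only feeds urllib3's debug log line.

    # conftest.py (hypothetical local shim for the build chroot)
    # urllib3 >= 2.3 reads response.version_string when logging each request;
    # give vcrpy's playback stub that attribute before any test runs.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # Recorded cassettes carry no protocol string, so report a fixed
        # HTTP/1.1; the value is only used in urllib3's debug message.
        VCRHTTPResponse.version_string = "HTTP/1.1"

Upgrading python-vcrpy in the build root to a release that already provides version_string on its stub would make such a shim unnecessary.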
_ Ns1ProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

tests/providers/integration_tests.py:133:
[call chain, request arguments, and _make_request() listing identical to the first failure above]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Ns1ProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

tests/providers/integration_tests.py:156:
[call chain, request arguments, and _make_request() listing identical to the first failure above]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Ns1ProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

tests/providers/integration_tests.py:147:
[call chain, request arguments, and _make_request() listing identical to the first failure above]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Ns1ProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _

tests/providers/integration_tests.py:140:
[call chain, request arguments, and _make_request() listing identical to the first failure above]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Ns1ProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _

tests/providers/integration_tests.py:489:
[call chain, request arguments, and _make_request() listing identical to the first failure above]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Ns1ProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _

tests/providers/integration_tests.py:475:
[call chain, request arguments, and _make_request() listing identical to the first failure above]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
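Every failure so far names Ns1ProviderTests and nothing else, so instead of patching the response stub the class could simply be skipped while the suite runs in the build chroot. Below is a sketch of a standard pytest collection hook doing that; it is hypothetical, meant only for the packaging environment, and relies on nothing beyond pytest itself.

    # conftest.py (hypothetical, build-environment only): skip the NS1
    # provider tests whose cassettes cannot be replayed under urllib3 >= 2.3.
    import pytest

    def pytest_collection_modifyitems(config, items):
        skip_ns1 = pytest.mark.skip(
            reason="VCRHTTPResponse lacks version_string under urllib3 >= 2.3"
        )
        for item in items:
            # Ns1ProviderTests is the failing class seen throughout this log.
            if "Ns1ProviderTests" in item.nodeid:
                item.add_marker(skip_ns1)

Passing -k 'not Ns1ProviderTests' (or explicit --deselect options) to pytest from the PKGBUILD's check() function would have the same effect without adding a file.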
_ Ns1ProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _

tests/providers/integration_tests.py:303:
[call chain, request arguments, and _make_request() listing identical to the first failure above]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Ns1ProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _

tests/providers/integration_tests.py:327:
[call chain, request arguments, and _make_request() listing identical to the first failure above]
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Ns1ProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _
self = > ???
tests/providers/integration_tests.py:313:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/nsone.py:33: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/nsone.py:253: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
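Every one of these Ns1ProviderTests failures is the same mismatch: the urllib3 in this chroot interpolates response.version_string into its request debug log line (connectionpool.py:551), while the VCRHTTPResponse object that vcrpy substitutes during cassette playback defines no such attribute. A minimal local shim along the following lines, e.g. in a conftest.py, sketches the workaround; this is only an illustration under the assumption that the stub class is importable from vcr.stubs, not a change that is part of this package, and a vcrpy release that already defines the attribute (or an older urllib3 that does not log it) avoids the failure equally well.

# hypothetical conftest.py shim, not part of the dns-lexicon sources
from vcr.stubs import VCRHTTPResponse  # assumed import path for the stub named in the traceback

if not hasattr(VCRHTTPResponse, "version_string"):
    # urllib3 only formats this attribute into a debug log line, so a fixed
    # placeholder such as "HTTP/1.1" is enough to keep the lookup from raising.
    VCRHTTPResponse.version_string = "HTTP/1.1"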
_ Ns1ProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _
self = > ???
tests/providers/integration_tests.py:294:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/nsone.py:33: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/nsone.py:253: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Ns1ProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _
self = > ???
tests/providers/integration_tests.py:541:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/nsone.py:33: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/nsone.py:253: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Ns1ProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _
self = > ???
tests/providers/integration_tests.py:521:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/nsone.py:33: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/nsone.py:253: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Ns1ProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _
self = > ???
tests/providers/integration_tests.py:507:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/nsone.py:33: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/nsone.py:253: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Ns1ProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _
self = > ???
tests/providers/integration_tests.py:199:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/nsone.py:33: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/nsone.py:253: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Ns1ProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _
self = > ???
tests/providers/integration_tests.py:185:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/nsone.py:33: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/nsone.py:253: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Ns1ProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _
self = > ???
tests/providers/integration_tests.py:501:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/nsone.py:33: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/nsone.py:253: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Ns1ProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/nsone.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/nsone.py:253: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/zones/lexicon-example.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'X-NSONE-Key': 'placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used.
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ Ns1ProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/nsone.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/nsone.py:253: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/zones/lexicon-example.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'X-NSONE-Key': 'placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ Ns1ProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? 
tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/nsone.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/nsone.py:253: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/zones/lexicon-example.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'X-NSONE-Key': 'placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ Ns1ProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/nsone.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/nsone.py:253: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/zones/lexicon-example.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'X-NSONE-Key': 'placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ Ns1ProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/nsone.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/nsone.py:253: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v1/zones/lexicon-example.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'X-NSONE-Key': 'placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
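Every failure in this run is the same incompatibility rather than a provider-specific regression: urllib3 2.3 logs response.version_string after each completed request (the connectionpool.py:551 frame above), while the VCRHTTPResponse stub that vcrpy substitutes when replaying the recorded test cassettes does not define that attribute, so each provider's authenticate() call dies on its first replayed request. A minimal check along the following lines, run inside the build chroot, would confirm the version pair involved; the versions actually installed there are an assumption, not something this log shows.

    # Hypothetical sanity check, not part of the dns-lexicon build: report the
    # installed urllib3/vcrpy versions and whether the replay stub carries the
    # attribute that urllib3's connection-pool logging reads.
    from importlib.metadata import version

    from vcr.stubs import VCRHTTPResponse

    print("urllib3:", version("urllib3"))
    print("vcrpy:  ", version("vcrpy"))
    print("VCRHTTPResponse has version_string:",
          hasattr(VCRHTTPResponse, "version_string"))

If the stub lacks the attribute while urllib3 reports 2.3 or newer, every cassette-backed provider test is expected to fail exactly as above.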
________________ OnappProviderTests.test_provider_authenticate _________________

tests/providers/integration_tests.py:106:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/onapp.py:57: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/onapp.py:188: in _request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(

method = 'GET', url = '/dns_zones.json', body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...t-Type': 'application/json', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
preload_content = False, decode_content = False, enforce_content_length = True
[... urllib3 HTTPConnectionPool._make_request source listing elided; identical to the listing in the previous failures ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OnappProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

tests/providers/integration_tests.py:126:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/onapp.py:57: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/onapp.py:188: in _request
[... identical frames and _make_request listing elided; same GET /dns_zones.json request ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OnappProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

tests/providers/integration_tests.py:133:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/onapp.py:57: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/onapp.py:188: in _request
[... identical frames and _make_request listing elided ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
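The cleaner resolution is to rebuild against a python-vcrpy that already knows about urllib3's version_string attribute (newer vcrpy releases appear to add it to the replay stub). If the chroot cannot be updated, a small conftest-level shim is one possible stopgap; the file name and the HTTP/1.1 value below are assumptions for illustration, not part of the package.

    # conftest.py -- hypothetical local workaround, not shipped by dns-lexicon.
    # Give vcrpy's replayed-response stub the attribute that urllib3 >= 2.3 reads
    # when logging a completed request; the value only feeds a debug log message.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        VCRHTTPResponse.version_string = "HTTP/1.1"  # assumed protocol of the cassettes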
_ OnappProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

tests/providers/integration_tests.py:156:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/onapp.py:57: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/onapp.py:188: in _request
[... identical frames and _make_request listing elided ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OnappProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

self = >
???
tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/onapp.py:57: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/onapp.py:188: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns_zones.json', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...t-Type': 'application/json', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ OnappProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/onapp.py:57: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/onapp.py:188: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns_zones.json', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...t-Type': 'application/json', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. 
Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ OnappProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _ self = > ??? tests/providers/integration_tests.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/onapp.py:57: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/onapp.py:188: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns_zones.json', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...t-Type': 'application/json', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ OnappProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ self = > ??? tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/onapp.py:57: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/onapp.py:188: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns_zones.json', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...t-Type': 'application/json', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ OnappProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/onapp.py:57: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/onapp.py:188: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns_zones.json', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...t-Type': 'application/json', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ OnappProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/onapp.py:57: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/onapp.py:188: in _request ??? 
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns_zones.json', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...t-Type': 'application/json', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. 
""" self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ OnappProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/onapp.py:57: in authenticate ??? 
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/onapp.py:188: in _request ??? /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns_zones.json', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...t-Type': 'application/json', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ OnappProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 

_ OnappProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _
tests/providers/integration_tests.py:541:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/onapp.py:57: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/onapp.py:188: in _request
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ OnappProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _
tests/providers/integration_tests.py:521:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/onapp.py:57: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/onapp.py:188: in _request
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ OnappProviderTests.test_provider_when_calling_list_records_after_setting_ttl _
tests/providers/integration_tests.py:211:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/onapp.py:57: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/onapp.py:188: in _request
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ OnappProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _
tests/providers/integration_tests.py:507:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/onapp.py:57: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/onapp.py:188: in _request
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ OnappProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _
tests/providers/integration_tests.py:199:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/onapp.py:57: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/onapp.py:188: in _request
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ OnappProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _
tests/providers/integration_tests.py:185:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/onapp.py:57: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/onapp.py:188: in _request
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ OnappProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _
tests/providers/integration_tests.py:501:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/onapp.py:57: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/onapp.py:188: in _request
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
""" self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ OnappProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/onapp.py:57: in authenticate ??? 
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/onapp.py:188: in _request ???
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OnappProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _
self = > ???
tests/providers/integration_tests.py:166:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/onapp.py:57: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/onapp.py:188: in _request ???
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OnappProviderTests.test_provider_when_calling_update_record_should_modify_record _
self = > ???
tests/providers/integration_tests.py:240:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/onapp.py:57: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/onapp.py:188: in _request ???
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OnappProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _
self = > ???
tests/providers/integration_tests.py:251:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/onapp.py:57: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/onapp.py:188: in _request ???
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OnappProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _
self = > ???
tests/providers/integration_tests.py:275:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/onapp.py:57: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/onapp.py:188: in _request ???
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OnappProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _
self = > ???
tests/providers/integration_tests.py:259:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/onapp.py:57: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/onapp.py:188: in _request ???
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
________________ OnlineProviderTests.test_provider_authenticate ________________
self = > ???
tests/providers/integration_tests.py:106:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/online.py:41: in authenticate ???
src/lexicon/_private/providers/online.py:55: in _init_zones ???
src/lexicon/_private/providers/online.py:47: in _list_zones ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/online.py:249: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
self = conn = method = 'GET', url = '/api/v1/domain/capsulecd.com/version', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OnlineProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
self = > ???
tests/providers/integration_tests.py:126:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/online.py:41: in authenticate ???
src/lexicon/_private/providers/online.py:55: in _init_zones ???
src/lexicon/_private/providers/online.py:47: in _list_zones ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/online.py:249: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OnlineProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _ self = > ???
tests/providers/integration_tests.py:133:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/online.py:41: in authenticate ???
src/lexicon/_private/providers/online.py:55: in _init_zones ???
src/lexicon/_private/providers/online.py:47: in _list_zones ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/online.py:249: in _request ???
[... requests/urllib3 frames and _make_request listing identical to the failure above ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OnlineProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
self = > ???
tests/providers/integration_tests.py:156:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/online.py:41: in authenticate ???
src/lexicon/_private/providers/online.py:55: in _init_zones ???
src/lexicon/_private/providers/online.py:47: in _list_zones ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/online.py:249: in _request ???
[... requests/urllib3 frames and _make_request listing identical to the failure above ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OnlineProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _
self = > ???
tests/providers/integration_tests.py:147:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/online.py:41: in authenticate ???
src/lexicon/_private/providers/online.py:55: in _init_zones ???
src/lexicon/_private/providers/online.py:47: in _list_zones ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/online.py:249: in _request ???
[... requests/urllib3 frames and _make_request listing identical to the failure above ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OnlineProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _
self = > ???
tests/providers/integration_tests.py:140:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/online.py:41: in authenticate ???
src/lexicon/_private/providers/online.py:55: in _init_zones ???
src/lexicon/_private/providers/online.py:47: in _list_zones ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/online.py:249: in _request ???
[... requests/urllib3 frames and _make_request listing identical to the failure above ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OnlineProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _
self = > ???
tests/providers/integration_tests.py:489:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/online.py:41: in authenticate ???
src/lexicon/_private/providers/online.py:55: in _init_zones ???
src/lexicon/_private/providers/online.py:47: in _list_zones ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/online.py:249: in _request ???
[... requests/urllib3 frames and _make_request listing identical to the failure above ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OnlineProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _
self = > ???
tests/providers/integration_tests.py:475:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/online.py:41: in authenticate ???
src/lexicon/_private/providers/online.py:55: in _init_zones ???
src/lexicon/_private/providers/online.py:47: in _list_zones ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/online.py:249: in _request ???
[... requests/urllib3 frames and _make_request listing identical to the failure above ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OnlineProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _
self = > ???
tests/providers/integration_tests.py:303:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/online.py:41: in authenticate ???
src/lexicon/_private/providers/online.py:55: in _init_zones ???
src/lexicon/_private/providers/online.py:47: in _list_zones ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/online.py:249: in _request ???
[... requests/urllib3 frames and _make_request listing identical to the failure above ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OnlineProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _
self = > ???
tests/providers/integration_tests.py:327:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/online.py:41: in authenticate ???
src/lexicon/_private/providers/online.py:55: in _init_zones ???
src/lexicon/_private/providers/online.py:47: in _list_zones ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/online.py:249: in _request ???
[... requests/urllib3 frames and _make_request listing identical to the failure above ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
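Every one of these OnlineProviderTests failures is the same incompatibility: urllib3 2.3 started logging response.version_string in connectionpool._make_request, and the VCRHTTPResponse object that vcrpy substitutes while replaying cassettes does not define that attribute. Below is a minimal workaround sketch, hypothetical and not part of the dns-lexicon sources, assuming the chroot keeps the current python-urllib3 and python-vcrpy; a vcrpy release that implements version_string would make it unnecessary.

    # conftest.py -- hypothetical shim: give vcrpy's replayed responses the
    # version_string attribute that urllib3 >= 2.3 reads for its debug log line.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # Only urllib3's '%s://%s:%s "%s %s %s" %s %s' debug logging consumes
        # this value, so a fixed HTTP version string is enough for replay.
        VCRHTTPResponse.version_string = "HTTP/1.1"

Pinning python-urllib3 below 2.3 in the build chroot would avoid the attribute lookup as well, at the cost of testing against an older urllib3 than the one the repository ships.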
_ OnlineProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _
self = > ???
_ OnlineProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _
self = > ???
tests/providers/integration_tests.py:313:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/online.py:41: in authenticate
src/lexicon/_private/providers/online.py:55: in _init_zones
src/lexicon/_private/providers/online.py:47: in _list_zones
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/online.py:249: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OnlineProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _
self = > ???
tests/providers/integration_tests.py:541:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/online.py:41: in authenticate
src/lexicon/_private/providers/online.py:55: in _init_zones
src/lexicon/_private/providers/online.py:47: in _list_zones
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/online.py:249: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OnlineProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _
self = > ???
tests/providers/integration_tests.py:521:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/online.py:41: in authenticate
src/lexicon/_private/providers/online.py:55: in _init_zones
src/lexicon/_private/providers/online.py:47: in _list_zones
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/online.py:249: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OnlineProviderTests.test_provider_when_calling_list_records_after_setting_ttl _
self = > ???
tests/providers/integration_tests.py:211:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/online.py:41: in authenticate
src/lexicon/_private/providers/online.py:55: in _init_zones
src/lexicon/_private/providers/online.py:47: in _list_zones
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/online.py:249: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
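
The same AttributeError repeats once per test because every OnlineProviderTests case authenticates first, and authentication replays the same recorded GET /api/v1/domain/capsulecd.com/version exchange before the test body runs, so each test dies on the identical logging call inside urllib3. A quick probe in the build environment's Python would confirm that the mismatch is between the installed urllib3 and the vcrpy stub rather than anything in lexicon itself; the import path of the stub is taken from vcrpy's public stubs module and is an assumption here.

    # Diagnostic probe (sketch): compare the attribute urllib3 >= 2.3 logs with
    # what vcrpy's replayed-response class actually provides.
    import urllib3
    from urllib3.response import HTTPResponse
    from vcr.stubs import VCRHTTPResponse

    print("urllib3", urllib3.__version__)
    print("urllib3 HTTPResponse has version_string:", hasattr(HTTPResponse, "version_string"))
    print("VCRHTTPResponse has version_string:", hasattr(VCRHTTPResponse, "version_string"))
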
_ OnlineProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _
self = > ???
tests/providers/integration_tests.py:507:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/online.py:41: in authenticate
src/lexicon/_private/providers/online.py:55: in _init_zones
src/lexicon/_private/providers/online.py:47: in _list_zones
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/online.py:249: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OnlineProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _
self = > ???
tests/providers/integration_tests.py:199:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/online.py:41: in authenticate
src/lexicon/_private/providers/online.py:55: in _init_zones
src/lexicon/_private/providers/online.py:47: in _list_zones
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/online.py:249: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OnlineProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _
self = > ???
tests/providers/integration_tests.py:185:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/online.py:41: in authenticate
src/lexicon/_private/providers/online.py:55: in _init_zones
src/lexicon/_private/providers/online.py:47: in _list_zones
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/online.py:249: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OnlineProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _
self = > ???
tests/providers/integration_tests.py:501:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/online.py:41: in authenticate
src/lexicon/_private/providers/online.py:55: in _init_zones
src/lexicon/_private/providers/online.py:47: in _list_zones
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/online.py:249: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OnlineProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _
self = > ???
tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/online.py:41: in authenticate ??? src/lexicon/_private/providers/online.py:55: in _init_zones ??? src/lexicon/_private/providers/online.py:47: in _list_zones ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/online.py:249: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/domain/capsulecd.com/version', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ OnlineProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/online.py:41: in authenticate ??? src/lexicon/_private/providers/online.py:55: in _init_zones ??? src/lexicon/_private/providers/online.py:47: in _list_zones ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/online.py:249: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/domain/capsulecd.com/version', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ OnlineProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/online.py:41: in authenticate ??? src/lexicon/_private/providers/online.py:55: in _init_zones ??? src/lexicon/_private/providers/online.py:47: in _list_zones ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/online.py:249: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/domain/capsulecd.com/version', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ OnlineProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/online.py:41: in authenticate ??? src/lexicon/_private/providers/online.py:55: in _init_zones ??? src/lexicon/_private/providers/online.py:47: in _list_zones ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/online.py:249: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/domain/capsulecd.com/version', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ OnlineProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? 
tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/online.py:41: in authenticate ??? src/lexicon/_private/providers/online.py:55: in _init_zones ??? src/lexicon/_private/providers/online.py:47: in _list_zones ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/online.py:249: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/domain/capsulecd.com/version', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ OnlineProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/online.py:41: in authenticate ??? src/lexicon/_private/providers/online.py:55: in _init_zones ??? src/lexicon/_private/providers/online.py:47: in _list_zones ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/online.py:249: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/domain/capsulecd.com/version', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _________________ OvhProviderTests.test_provider_authenticate __________________ self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. 
Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ OvhProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. 
Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
            # Instead of catching the exception and assuming all BadStatusLine
            # exceptions are read timeouts, check for a zero timeout before
            # making the request.
            if read_timeout == 0:
                raise ReadTimeoutError(
                    self, url, f"Read timed out. (read timeout={read_timeout})"
                )
            conn.timeout = read_timeout

        # Receive the response from the server
        try:
            response = conn.getresponse()
        except (BaseSSLError, OSError) as e:
            self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
            raise

        # Set properties that are used by the pooling layer.
        response.retries = retries
        response._connection = response_conn  # type: ignore[attr-defined]
        response._pool = self  # type: ignore[attr-defined]

        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

The next nine OvhProviderTests failures are identical: each test reaches authenticate() in src/lexicon/_private/providers/ovh.py:90 via tests/providers/integration_tests.py:418 and ends in the same AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' at /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551. Only the test entry point differs:

_ OvhProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
    tests/providers/integration_tests.py:133
_ OvhProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
    tests/providers/integration_tests.py:156
_ OvhProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _
    tests/providers/integration_tests.py:147
_ OvhProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _
    tests/providers/integration_tests.py:140
_ OvhProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _
    tests/providers/integration_tests.py:489
_ OvhProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _
    tests/providers/integration_tests.py:475
_ OvhProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _
    tests/providers/integration_tests.py:303
_ OvhProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _
    tests/providers/integration_tests.py:327
_ OvhProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _
    tests/providers/integration_tests.py:313
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ OvhProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ OvhProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? 
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. 
Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ OvhProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? 
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. 
Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError __ OvhProviderTests.test_provider_when_calling_list_records_after_setting_ttl __ self = > ??? tests/providers/integration_tests.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? 
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. 
Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ OvhProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? tests/providers/integration_tests.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? 
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. 
Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ OvhProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? 
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. 
Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ OvhProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? 
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. 
Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ OvhProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ self = > ??? tests/providers/integration_tests.py:501: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? 
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/1.0/auth/time', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. 
Otherwise, raise error. """ [... _make_request body identical to the copy above ...] > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OvhProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:173: tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? [... requests/urllib3 traceback identical to the one above ...] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OvhProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? tests/providers/integration_tests.py:166: tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? [... identical traceback ...] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OvhProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? tests/providers/integration_tests.py:240: tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? [... identical traceback ...] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OvhProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? tests/providers/integration_tests.py:251: tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? [... identical traceback ...] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OvhProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? [... identical traceback ...] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ OvhProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:259: tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ovh.py:90: in authenticate ??? [... identical traceback ...] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
________________ PleskProviderTests.test_provider_authenticate _________________ self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/plesk.py:116: in authenticate ???
src/lexicon/_private/providers/plesk.py:111: in __find_site ??? src/lexicon/_private/providers/plesk.py:65: in __simple_request ??? src/lexicon/_private/providers/plesk.py:97: in __plesk_request ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/enterprise/control/agent.php' body = '\nlexicon-test.com' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...'Content-Length': '137', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Bhc3N3b3Jk'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. 
:param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ PleskProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/plesk.py:116: in authenticate ??? src/lexicon/_private/providers/plesk.py:111: in __find_site ??? src/lexicon/_private/providers/plesk.py:65: in __simple_request ??? src/lexicon/_private/providers/plesk.py:97: in __plesk_request ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/enterprise/control/agent.php' body = '\nlexicon-test.com' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...'Content-Length': '137', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Bhc3N3b3Jk'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. 
Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
            # the exception and assuming all BadStatusLine exceptions are read
            # timeouts, check for a zero timeout before making the request.
            if read_timeout == 0:
                raise ReadTimeoutError(
                    self, url, f"Read timed out. (read timeout={read_timeout})"
                )
            conn.timeout = read_timeout

        # Receive the response from the server
        try:
            response = conn.getresponse()
        except (BaseSSLError, OSError) as e:
            self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
            raise

        # Set properties that are used by the pooling layer.
        response.retries = retries
        response._connection = response_conn  # type: ignore[attr-defined]
        response._pool = self  # type: ignore[attr-defined]

        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
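Every Plesk failure in this run is the same incompatibility: the urllib3 in the build chroot logs response.version_string after each request (connectionpool.py:551 above), while the playback object that vcrpy substitutes for a real HTTP response, VCRHTTPResponse, does not define that attribute. The sketch below is a hypothetical compatibility shim, not part of the package sources; it assumes the stub class lives at vcr.stubs.VCRHTTPResponse and that the installed vcrpy predates the upstream release that adds the attribute.

# conftest.py -- illustrative shim only, assuming vcr.stubs.VCRHTTPResponse is
# the class urllib3 ends up logging, and that it may expose an http.client-style
# integer `version` (11 -> HTTP/1.1, 10 -> HTTP/1.0) for the recorded exchange.
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):

    def _version_string(self) -> str:
        # urllib3 >= 2.3 reads this attribute when logging the request line.
        return "HTTP/1.0" if getattr(self, "version", 11) == 10 else "HTTP/1.1"

    VCRHTTPResponse.version_string = property(_version_string)

With a shim like this loaded from the test suite's conftest, the log.debug() call above finds a version_string and the recorded Plesk exchanges play back normally. Upgrading vcrpy (newer releases reportedly ship their own version_string support) or keeping urllib3 below 2.3 in the test environment would achieve the same result without patching.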
_ PleskProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

self = 

>   ???

tests/providers/integration_tests.py:133: 
[... traceback identical to the first Plesk failure above (plesk.py -> requests -> urllib3 _make_request) ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ PleskProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

self = 

>   ???

tests/providers/integration_tests.py:156: 
[... traceback identical to the first Plesk failure above (plesk.py -> requests -> urllib3 _make_request) ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ PleskProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

self = 

>   ???

tests/providers/integration_tests.py:147: 
[... traceback identical to the first Plesk failure above (plesk.py -> requests -> urllib3 _make_request) ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ PleskProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _

self = 

>   ???

tests/providers/integration_tests.py:140: 
[... traceback identical to the first Plesk failure above (plesk.py -> requests -> urllib3 _make_request) ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ PleskProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _

self = 

>   ???

tests/providers/integration_tests.py:489: 
[... traceback identical to the first Plesk failure above (plesk.py -> requests -> urllib3 _make_request) ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ PleskProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _

self = 

>   ???

tests/providers/integration_tests.py:475: 
[... traceback identical to the first Plesk failure above (plesk.py -> requests -> urllib3 _make_request) ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ PleskProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _

self = 

>   ???

tests/providers/integration_tests.py:303: 
[... traceback identical to the first Plesk failure above (plesk.py -> requests -> urllib3 _make_request) ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ PleskProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _

self = 

>   ???

tests/providers/integration_tests.py:327: 
[... traceback identical to the first Plesk failure above (plesk.py -> requests -> urllib3 _make_request) ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ PleskProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/plesk.py:116: in authenticate ??? src/lexicon/_private/providers/plesk.py:111: in __find_site ??? src/lexicon/_private/providers/plesk.py:65: in __simple_request ??? src/lexicon/_private/providers/plesk.py:97: in __plesk_request ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/enterprise/control/agent.php' body = '\nlexicon-test.com' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...'Content-Length': '137', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Bhc3N3b3Jk'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. 
Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ PleskProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/plesk.py:116: in authenticate ??? src/lexicon/_private/providers/plesk.py:111: in __find_site ??? src/lexicon/_private/providers/plesk.py:65: in __simple_request ??? src/lexicon/_private/providers/plesk.py:97: in __plesk_request ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/enterprise/control/agent.php' body = '\nlexicon-test.com' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...'Content-Length': '137', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Bhc3N3b3Jk'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) 
:param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ PleskProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/plesk.py:116: in authenticate ??? src/lexicon/_private/providers/plesk.py:111: in __find_site ??? src/lexicon/_private/providers/plesk.py:65: in __simple_request ??? src/lexicon/_private/providers/plesk.py:97: in __plesk_request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/enterprise/control/agent.php' body = '\nlexicon-test.com' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...'Content-Length': '137', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Bhc3N3b3Jk'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ PleskProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/plesk.py:116: in authenticate ??? src/lexicon/_private/providers/plesk.py:111: in __find_site ??? src/lexicon/_private/providers/plesk.py:65: in __simple_request ??? src/lexicon/_private/providers/plesk.py:97: in __plesk_request ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/enterprise/control/agent.php' body = '\nlexicon-test.com' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...'Content-Length': '137', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Bhc3N3b3Jk'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. 
Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ PleskProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? tests/providers/integration_tests.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/plesk.py:116: in authenticate ??? src/lexicon/_private/providers/plesk.py:111: in __find_site ??? src/lexicon/_private/providers/plesk.py:65: in __simple_request ??? src/lexicon/_private/providers/plesk.py:97: in __plesk_request ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/enterprise/control/agent.php' body = '\nlexicon-test.com' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...'Content-Length': '137', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Bhc3N3b3Jk'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ PleskProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/plesk.py:116: in authenticate ??? src/lexicon/_private/providers/plesk.py:111: in __find_site ??? src/lexicon/_private/providers/plesk.py:65: in __simple_request ??? src/lexicon/_private/providers/plesk.py:97: in __plesk_request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/enterprise/control/agent.php' body = '\nlexicon-test.com' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...'Content-Length': '137', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Bhc3N3b3Jk'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ PleskProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/plesk.py:116: in authenticate ??? src/lexicon/_private/providers/plesk.py:111: in __find_site ??? src/lexicon/_private/providers/plesk.py:65: in __simple_request ??? src/lexicon/_private/providers/plesk.py:97: in __plesk_request ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/enterprise/control/agent.php' body = '\nlexicon-test.com' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...'Content-Length': '137', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Bhc3N3b3Jk'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. 
Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ PleskProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ self = > ??? tests/providers/integration_tests.py:501: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/plesk.py:116: in authenticate ??? src/lexicon/_private/providers/plesk.py:111: in __find_site ??? src/lexicon/_private/providers/plesk.py:65: in __simple_request ??? src/lexicon/_private/providers/plesk.py:97: in __plesk_request ??? /usr/lib/python3.13/site-packages/requests/api.py:115: in post return request("post", url, data=data, json=json, **kwargs) /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/enterprise/control/agent.php' body = '\nlexicon-test.com' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...'Content-Length': '137', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Bhc3N3b3Jk'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) 
:param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ PleskProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/plesk.py:116: in authenticate ??? src/lexicon/_private/providers/plesk.py:111: in __find_site ??? src/lexicon/_private/providers/plesk.py:65: in __simple_request ??? src/lexicon/_private/providers/plesk.py:97: in __plesk_request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:115: in post
    return request("post", url, data=data, json=json, **kwargs)
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
[captured request locals and the _make_request() source listing are identical to the failure above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PleskProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _
self = > ???
tests/providers/integration_tests.py:166:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/plesk.py:116: in authenticate ???
src/lexicon/_private/providers/plesk.py:111: in __find_site ???
src/lexicon/_private/providers/plesk.py:65: in __simple_request ???
src/lexicon/_private/providers/plesk.py:97: in __plesk_request ???
[requests/urllib3 frames, request locals, and _make_request() listing identical to the failures above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PleskProviderTests.test_provider_when_calling_update_record_should_modify_record _
self = > ???
tests/providers/integration_tests.py:240:
[same call chain through plesk.py authenticate/__find_site/__simple_request/__plesk_request, requests, and urllib3; same _make_request() listing]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PleskProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _
self = > ???
tests/providers/integration_tests.py:251:
[same call chain and _make_request() listing]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PleskProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _
self = > ???
tests/providers/integration_tests.py:275:
[same call chain and _make_request() listing]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PleskProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _
self = > ???
tests/providers/integration_tests.py:259:
[same call chain and _make_request() listing]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_______________ PointHqProviderTests.test_provider_authenticate ________________
self = > ???
tests/providers/integration_tests.py:106:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/pointhq.py:36: in authenticate ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/pointhq.py:139: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
[request locals here are method = 'GET', url = '/zones/capsulecd.com', body = '{}'; the _make_request() listing is identical to the failures above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
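Every failure above, and the create_record_for_A failure that follows, is the same AttributeError raised from urllib3's own log.debug() call at connectionpool.py:551, not from lexicon code. It appears to be the known incompatibility between the urllib3 on this system, which logs response.version_string (an attribute introduced in the urllib3 2.3 series), and the VCRHTTPResponse replay stub shipped by python-vcrpy, which predates that attribute; rebuilding against a vcrpy release that defines version_string, or constraining urllib3 below 2.3 for the test suite, should clear it. As a purely hypothetical stop-gap for local builds, a conftest.py shim along the lines below could patch the stub; the property and version mapping are assumptions for illustration, not part of the dns-lexicon sources.

    # conftest.py -- hypothetical stop-gap, not part of the dns-lexicon package.
    # urllib3 >= 2.3 reads response.version_string when logging a completed
    # request, but vcrpy's replayed VCRHTTPResponse does not define it, so every
    # cassette-backed request dies at connectionpool.py:551 as seen above.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        VCRHTTPResponse.version_string = property(
            # http.client encodes the HTTP protocol version as 10 or 11.
            lambda self: {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(
                getattr(self, "version", 11), "HTTP/?"
            )
        )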
_ PointHqProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
self = > ???
_ PointHqProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

self = > ???

tests/providers/integration_tests.py:126:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/pointhq.py:36: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/pointhq.py:139: in _request
    ???
[... requests/urllib3 frames, locals and the _make_request() listing are identical to the failure above and omitted ...]
>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PointHqProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

self = > ???

tests/providers/integration_tests.py:133:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/pointhq.py:36: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/pointhq.py:139: in _request
    ???
[... identical frames, locals and _make_request() listing omitted ...]
>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PointHqProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

self = > ???

tests/providers/integration_tests.py:156:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/pointhq.py:36: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/pointhq.py:139: in _request
    ???
[... identical frames, locals and _make_request() listing omitted ...]
>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PointHqProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

self = > ???

tests/providers/integration_tests.py:147:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/pointhq.py:36: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/pointhq.py:139: in _request
    ???
[... identical frames, locals and _make_request() listing omitted ...]
>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PointHqProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _

self = > ???

tests/providers/integration_tests.py:140:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/pointhq.py:36: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/pointhq.py:139: in _request
    ???
[... identical frames, locals and _make_request() listing omitted ...]
>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
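For reference, the attribute that log line reaches for is a small convenience urllib3 derives from the numeric HTTP version of a response; a replayed response that only carries the classic integer version satisfies every other field in that debug call and trips on this one alone. Roughly, as a sketch of the idea rather than urllib3's literal code:

    # Sketch: how urllib3 (>= 2.3) presents the protocol version as a string.
    # http.client exposes version as an int (10 or 11); the string form exists so
    # the '%s://%s:%s "..."' debug line above reads as e.g. "HTTP/1.1".
    def version_string(version: int) -> str:
        return {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(version, "HTTP/?")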
_ PointHqProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _

self = > ???

tests/providers/integration_tests.py:489:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/pointhq.py:36: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/pointhq.py:139: in _request
    ???
[... identical frames, locals and _make_request() listing omitted ...]
>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PointHqProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _

self = > ???

tests/providers/integration_tests.py:475:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/pointhq.py:36: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/pointhq.py:139: in _request
    ???
[... identical frames, locals and _make_request() listing omitted ...]
>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
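Because every PointHQ case (and, judging by the pattern of this log, the other cassette-backed providers) fails inside authenticate before exercising any provider-specific behaviour, a packager who cannot yet update vcrpy could also deselect the affected integration tests in check() as a stopgap. A hypothetical collection hook, with an illustrative name filter rather than whatever the real PKGBUILD does:

    # conftest.py (hypothetical) -- skip the VCR-replayed provider suites while the
    # vcrpy/urllib3 "version_string" mismatch persists; the name filter is illustrative.
    import pytest

    def pytest_collection_modifyitems(config, items):
        skip = pytest.mark.skip(reason="vcrpy stub lacks version_string on urllib3 >= 2.3")
        for item in items:
            if "PointHqProviderTests" in item.nodeid:
                item.add_marker(skip)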
_ PointHqProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _

self = > ???

tests/providers/integration_tests.py:303:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/pointhq.py:36: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/pointhq.py:139: in _request
    ???
[... identical frames, locals and _make_request() listing omitted ...]
>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PointHqProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _

self = > ???
tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/pointhq.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/pointhq.py:139: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones/capsulecd.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ PointHqProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/pointhq.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/pointhq.py:139: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones/capsulecd.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ PointHqProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/pointhq.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/pointhq.py:139: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones/capsulecd.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ PointHqProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/pointhq.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/pointhq.py:139: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones/capsulecd.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ PointHqProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? 
tests/providers/integration_tests.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/pointhq.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/pointhq.py:139: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones/capsulecd.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ PointHqProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? tests/providers/integration_tests.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/pointhq.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/pointhq.py:139: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones/capsulecd.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ PointHqProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/pointhq.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/pointhq.py:139: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones/capsulecd.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ PointHqProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/pointhq.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/pointhq.py:139: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/zones/capsulecd.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
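Every PointHq failure above is the same crash: after replaying a recorded response, urllib3's connection pool logs response.version_string, and the VCRHTTPResponse object that vcrpy substitutes during cassette playback does not define that attribute. This looks like the known mismatch between an older python-vcrpy and urllib3 2.3+ (which appears to be where version_string was introduced); rebuilding once the chroot picks up a python-vcrpy release that implements the attribute should clear this whole block of failures. As a stopgap, a conftest.py shim along the lines below could paper over it during the build. This is a minimal sketch under that assumption, not the upstream fix: the helper name is invented here, and it relies on the attribute only being read for urllib3's debug log line, so a fixed placeholder value suffices.

# conftest.py (sketch) -- give vcrpy's stub responses the attribute newer urllib3 logs
import vcr.stubs


def _add_version_string_stub() -> None:
    """Attach a version_string attribute to VCRHTTPResponse if it is missing."""
    response_cls = vcr.stubs.VCRHTTPResponse
    if not hasattr(response_cls, "version_string"):
        # urllib3 only interpolates this value into a log.debug() call, so a
        # static placeholder keeps cassette playback from raising AttributeError.
        response_cls.version_string = "HTTP/1.1"


_add_version_string_stub()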
_ PointHqProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _
_ PointHqProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _

self = >
    ???
tests/providers/integration_tests.py:501:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/pointhq.py:36: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/pointhq.py:139: in _request
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               response.version_string,
E               AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PointHqProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _

self = >
    ???
tests/providers/integration_tests.py:173:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/pointhq.py:36: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/pointhq.py:139: in _request
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               response.version_string,
E               AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PointHqProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _

self = >
    ???
tests/providers/integration_tests.py:166:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/pointhq.py:36: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/pointhq.py:139: in _request
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               response.version_string,
E               AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PointHqProviderTests.test_provider_when_calling_update_record_should_modify_record _

self = >
    ???
tests/providers/integration_tests.py:240:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/pointhq.py:36: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/pointhq.py:139: in _request
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               response.version_string,
E               AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PointHqProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _

self = >
    ???
tests/providers/integration_tests.py:275:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/pointhq.py:36: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/pointhq.py:139: in _request
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               response.version_string,
E               AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PointHqProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _

self = >
    ???
tests/providers/integration_tests.py:259:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/pointhq.py:36: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/pointhq.py:139: in _request
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               response.version_string,
E               AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
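The PorkbunProviderTests failures that follow are the same incompatibility; only the replayed request changes (a POST to /api/json/v3/ping carrying placeholder API keys instead of a GET on the zone). If patching the vcrpy stub is not wanted, another option for the packaging check() step would be to skip the cassette-driven provider suites; the collection hook below is only a sketch, and the tests/providers/ path filter is an assumption about which suites replay recorded traffic:

    # conftest.py (hypothetical alternative: skip the cassette-driven provider tests)
    import pytest

    def pytest_collection_modifyitems(config, items):
        skip_vcr = pytest.mark.skip(
            reason="vcrpy cassette replay incompatible with urllib3 >= 2.3"
        )
        for item in items:
            # The provider integration tests replay recorded HTTP traffic through vcrpy.
            if "tests/providers/" in str(item.fspath):
                item.add_marker(skip_vcr)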
_______________ PorkbunProviderTests.test_provider_authenticate ________________

self = >
    ???
tests/providers/integration_tests.py:106:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/porkbun.py:47: in authenticate
    ???
src/lexicon/interfaces.py:171: in _post
    return self._request("POST", url, data=data, query_params=query_params)
src/lexicon/_private/providers/porkbun.py:162: in _request
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               response.version_string,
E               AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PorkbunProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

self = >
    ???
tests/providers/integration_tests.py:126:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/porkbun.py:47: in authenticate
    ???
src/lexicon/interfaces.py:171: in _post
    return self._request("POST", url, data=data, query_params=query_params)
src/lexicon/_private/providers/porkbun.py:162: in _request
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               response.version_string,
E               AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PorkbunProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

self = >
    ???
tests/providers/integration_tests.py:133: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/porkbun.py:47: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/porkbun.py:162: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/json/v3/ping' body = '{"apikey": "placeholder_auth_key", "secretapikey": "placeholder_auth_secret"}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '77'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ PorkbunProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = > ??? tests/providers/integration_tests.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/porkbun.py:47: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/porkbun.py:162: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/json/v3/ping' body = '{"apikey": "placeholder_auth_key", "secretapikey": "placeholder_auth_secret"}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '77'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
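Every Porkbun test in this run fails at the same spot: urllib3's connection-pool logging reads response.version_string, an attribute that recent urllib3 releases expect every response object to carry, while the VCRHTTPResponse stub that the recorded cassettes substitute for a live response does not provide it. A minimal conftest-style shim is sketched below; it is an assumed workaround, not part of dns-lexicon, and bumping the vcrpy test dependency is likely the cleaner route, since newer vcrpy releases are reported to ship the attribute themselves.

    # Hedged sketch (assumed workaround, not part of the package): give vcrpy's
    # replayed response class the attribute urllib3's request logging reads.
    try:
        from vcr.stubs import VCRHTTPResponse
    except ImportError:  # vcrpy not installed; nothing to patch
        VCRHTTPResponse = None

    if VCRHTTPResponse is not None and not hasattr(VCRHTTPResponse, "version_string"):
        # "HTTP/1.1" is a reasonable stand-in for responses replayed from cassettes.
        VCRHTTPResponse.version_string = "HTTP/1.1"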
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ PorkbunProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = > ??? tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/porkbun.py:47: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/porkbun.py:162: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/json/v3/ping' body = '{"apikey": "placeholder_auth_key", "secretapikey": "placeholder_auth_secret"}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '77'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ PorkbunProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/porkbun.py:47: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/porkbun.py:162: in _request ??? 
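The lexicon frames above end in the Porkbun provider's authenticate() posting the API key pair to /api/json/v3/ping. A standalone sketch of that call follows; the path and JSON payload are taken from the traceback, while the host name and the requests usage are assumptions for illustration only.

    # Hedged sketch of the authentication ping the cassettes replay.
    import requests

    resp = requests.post(
        "https://api.porkbun.com/api/json/v3/ping",  # base URL assumed
        json={"apikey": "placeholder_auth_key", "secretapikey": "placeholder_auth_secret"},
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.json())  # a successful ping is expected to report a status field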
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/json/v3/ping' body = '{"apikey": "placeholder_auth_key", "secretapikey": "placeholder_auth_secret"}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '77'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ PorkbunProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _ self = > ??? 
tests/providers/integration_tests.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/porkbun.py:47: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/porkbun.py:162: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/json/v3/ping' body = '{"apikey": "placeholder_auth_key", "secretapikey": "placeholder_auth_secret"}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '77'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
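The frame locals above show the per-request configuration that requests hands to urllib3: a Retry with total=0 and an all-None Timeout. As a hedged illustration of the retries and timeout parameters documented in the _make_request docstring, the same objects can be constructed directly:

    # Hedged illustration mirroring the locals in the traceback: total=0 means
    # no retry attempts, and an all-None Timeout waits indefinitely on both the
    # connect and read phases.
    from urllib3.util.retry import Retry
    from urllib3.util.timeout import Timeout

    retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
    timeout = Timeout(connect=None, read=None, total=None)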
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ PorkbunProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ self = > ??? tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/porkbun.py:47: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/porkbun.py:162: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/json/v3/ping' body = '{"apikey": "placeholder_auth_key", "secretapikey": "placeholder_auth_secret"}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '77'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ PorkbunProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/porkbun.py:47: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/porkbun.py:162: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/json/v3/ping' body = '{"apikey": "placeholder_auth_key", "secretapikey": "placeholder_auth_secret"}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '77'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ PorkbunProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/porkbun.py:47: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/porkbun.py:162: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/api/json/v3/ping' body = '{"apikey": "placeholder_auth_key", "secretapikey": "placeholder_auth_secret"}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '77'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
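Note that the failing attribute access sits inside a log.debug() call, yet the error occurs regardless of the logging level: the arguments are evaluated before logging decides whether to emit the record, so response.version_string is read on every replayed request. A small illustration of that evaluation order, using an assumed plain object that lacks the attribute:

    # Hedged illustration: the argument expression is evaluated before the
    # logger checks its level, so a missing attribute raises even though the
    # DEBUG record itself would be discarded.
    import logging

    class Probe:
        pass

    log = logging.getLogger("demo")
    log.setLevel(logging.WARNING)  # DEBUG records are filtered out...
    log.debug("HTTP version: %s", Probe().version_string)  # ...but this still raises AttributeError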
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ PorkbunProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? 
_ PorkbunProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _
self = > ???
tests/providers/integration_tests.py:313:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/porkbun.py:47: in authenticate
src/lexicon/interfaces.py:171: in _post
src/lexicon/_private/providers/porkbun.py:162: in _request
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PorkbunProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _
self = > ???
tests/providers/integration_tests.py:294:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/porkbun.py:47: in authenticate
src/lexicon/interfaces.py:171: in _post
src/lexicon/_private/providers/porkbun.py:162: in _request
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PorkbunProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _
self = > ???
tests/providers/integration_tests.py:541:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/porkbun.py:47: in authenticate
src/lexicon/interfaces.py:171: in _post
src/lexicon/_private/providers/porkbun.py:162: in _request
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PorkbunProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _
self = > ???
tests/providers/integration_tests.py:521:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/porkbun.py:47: in authenticate
src/lexicon/interfaces.py:171: in _post
src/lexicon/_private/providers/porkbun.py:162: in _request
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PorkbunProviderTests.test_provider_when_calling_list_records_after_setting_ttl _
self = > ???
tests/providers/integration_tests.py:211:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/porkbun.py:47: in authenticate
src/lexicon/interfaces.py:171: in _post
src/lexicon/_private/providers/porkbun.py:162: in _request
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PorkbunProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _
self = > ???
tests/providers/integration_tests.py:507:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/porkbun.py:47: in authenticate
src/lexicon/interfaces.py:171: in _post
src/lexicon/_private/providers/porkbun.py:162: in _request
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PorkbunProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _
self = > ???
tests/providers/integration_tests.py:199:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/porkbun.py:47: in authenticate
src/lexicon/interfaces.py:171: in _post
src/lexicon/_private/providers/porkbun.py:162: in _request
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PorkbunProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _
self = > ???
tests/providers/integration_tests.py:185:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/porkbun.py:47: in authenticate
src/lexicon/interfaces.py:171: in _post
src/lexicon/_private/providers/porkbun.py:162: in _request
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PorkbunProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _
self = > ???
_ PorkbunProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _

self = > ???

tests/providers/integration_tests.py:501:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/porkbun.py:47: in authenticate
    ???
src/lexicon/interfaces.py:171: in _post
    return self._request("POST", url, data=data, query_params=query_params)
src/lexicon/_private/providers/porkbun.py:162: in _request
    ???
    [... requests/urllib3 frames and _make_request() source context elided; identical to the failure above ...]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PorkbunProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _

self = > ???

tests/providers/integration_tests.py:173:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/porkbun.py:47: in authenticate
    ???
src/lexicon/interfaces.py:171: in _post
    return self._request("POST", url, data=data, query_params=query_params)
src/lexicon/_private/providers/porkbun.py:162: in _request
    ???
    [... requests/urllib3 frames and _make_request() source context elided; identical to the failures above ...]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PorkbunProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _

self = > ???

tests/providers/integration_tests.py:166:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/porkbun.py:47: in authenticate
    ???
src/lexicon/interfaces.py:171: in _post
    return self._request("POST", url, data=data, query_params=query_params)
src/lexicon/_private/providers/porkbun.py:162: in _request
    ???
    [... requests/urllib3 frames and _make_request() source context elided; identical to the failures above ...]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PorkbunProviderTests.test_provider_when_calling_update_record_should_modify_record _

self = > ???

tests/providers/integration_tests.py:240:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/porkbun.py:47: in authenticate
    ???
src/lexicon/interfaces.py:171: in _post
    return self._request("POST", url, data=data, query_params=query_params)
src/lexicon/_private/providers/porkbun.py:162: in _request
    ???
    [... requests/urllib3 frames and _make_request() source context elided; identical to the failures above ...]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PorkbunProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _

self = > ???
tests/providers/integration_tests.py:251:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/porkbun.py:47: in authenticate
    ???
src/lexicon/interfaces.py:171: in _post
    return self._request("POST", url, data=data, query_params=query_params)
src/lexicon/_private/providers/porkbun.py:162: in _request
    ???
    [... requests/urllib3 frames and _make_request() source context elided; identical to the failures above ...]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PorkbunProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _

self = > ???

tests/providers/integration_tests.py:275:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/porkbun.py:47: in authenticate
    ???
src/lexicon/interfaces.py:171: in _post
    return self._request("POST", url, data=data, query_params=query_params)
src/lexicon/_private/providers/porkbun.py:162: in _request
    ???
    [... requests/urllib3 frames and _make_request() source context elided; identical to the failures above ...]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PorkbunProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _

self = > ???

tests/providers/integration_tests.py:259:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/porkbun.py:47: in authenticate
    ???
src/lexicon/interfaces.py:171: in _post
    return self._request("POST", url, data=data, query_params=query_params)
src/lexicon/_private/providers/porkbun.py:162: in _request
    ???
    [... requests/urllib3 frames and _make_request() source context elided; identical to the failures above ...]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_______________ PowerdnsProviderTests.test_provider_authenticate _______________

self = > ???

tests/providers/integration_tests.py:106:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/powerdns.py:110: in authenticate
    ???
src/lexicon/_private/providers/powerdns.py:104: in zone_data
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/powerdns.py:260: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = conn = method = 'GET', url = '/api/v1/servers/localhost/zones/sometestdomain.com.'
body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'X-API-Key': 'placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn = preload_content = False, decode_content = False, enforce_content_length = True

    [... urllib3 _make_request() source context elided; identical to the Porkbun failures above ...]
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PowerdnsProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

self = > ???

tests/providers/integration_tests.py:126:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/powerdns.py:110: in authenticate
    ???
src/lexicon/_private/providers/powerdns.py:104: in zone_data
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/powerdns.py:260: in _request
    ???
    [... requests/urllib3 frames and _make_request() source context elided; identical to the failures above ...]
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PowerdnsProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
self = > ???
tests/providers/integration_tests.py:133:
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PowerdnsProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
self = > ???
tests/providers/integration_tests.py:156:
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PowerdnsProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _
self = > ???
tests/providers/integration_tests.py:147:
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PowerdnsProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _
self = > ???
tests/providers/integration_tests.py:140:
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PowerdnsProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _
self = > ???
tests/providers/integration_tests.py:489:
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PowerdnsProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _
self = > ???
tests/providers/integration_tests.py:475:
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PowerdnsProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _
self = > ???
tests/providers/integration_tests.py:303:
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PowerdnsProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PowerdnsProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:313: E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PowerdnsProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PowerdnsProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? tests/providers/integration_tests.py:541: E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PowerdnsProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PowerdnsProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ self = > ??? tests/providers/integration_tests.py:211: E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PowerdnsProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? tests/providers/integration_tests.py:507: E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PowerdnsProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:199: E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
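Every PowerDNS provider failure above is the same incompatibility: the cassette-playback stub used by these tests, vcrpy's VCRHTTPResponse, does not define the version_string attribute that urllib3 2.3's connection pool reads when it logs a completed request in _make_request. A minimal local shim, assuming the pinned vcrpy predates urllib3 2.3 support, might look like the sketch below; the conftest.py placement, the numeric-to-string version mapping, and the HTTP/1.1 fallback are assumptions for illustration, not part of this build.

# conftest.py (hypothetical workaround sketch, not the fix applied in this log)
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):
    # urllib3 >= 2.3 logs response.version_string; map the http.client-style
    # numeric version (10, 11, ...) that recorded responses may carry onto the
    # string form urllib3 expects, defaulting to HTTP/1.1 when it is absent.
    _VERSIONS = {9: "HTTP/0.9", 10: "HTTP/1.0", 11: "HTTP/1.1"}

    def _version_string(self):
        return _VERSIONS.get(getattr(self, "version", 11), "HTTP/1.1")

    VCRHTTPResponse.version_string = property(_version_string)

Upgrading python-vcrpy to a release that already provides version_string, or pinning urllib3 below 2.3 for the check() run, would avoid patching the stub at all.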
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ PowerdnsProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? 
tests/providers/integration_tests.py:185:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/powerdns.py:110: in authenticate ???
src/lexicon/_private/providers/powerdns.py:104: in zone_data ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/powerdns.py:260: in _request ???
[request locals and the urllib3 _make_request listing are identical to the failure above; the same material is elided from the six PowerdnsProviderTests failures below]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PowerdnsProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _
tests/providers/integration_tests.py:501: [traceback identical to the failure above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PowerdnsProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _
tests/providers/integration_tests.py:173: [traceback identical to the failure above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PowerdnsProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _
tests/providers/integration_tests.py:166: [traceback identical to the failure above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PowerdnsProviderTests.test_provider_when_calling_update_record_should_modify_record _
tests/providers/integration_tests.py:240: [traceback identical to the failure above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PowerdnsProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _
tests/providers/integration_tests.py:275: [traceback identical to the failure above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ PowerdnsProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _
tests/providers/integration_tests.py:259: [traceback identical to the failure above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
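[editor's note] Every PowerdnsProviderTests failure above, and the RackspaceProviderTests failures that follow, is the same interaction rather than a provider or cassette bug: the installed urllib3 logs response.version_string inside connectionpool._make_request() (an attribute that appears to have been introduced in urllib3 2.3), while the VCRHTTPResponse object that vcrpy substitutes when replaying the recorded cassettes does not define it, so every replayed request dies in the log.debug() call at connectionpool.py:551, during provider authentication and before any assertion runs. A minimal local shim is sketched below, assuming the test suite picks up a conftest.py and that patching vcr.stubs.VCRHTTPResponse is acceptable as a stop-gap; it is a sketch, not the upstream fix, and pinning urllib3 below 2.3 or moving to a vcrpy release that provides version_string itself would be the cleaner options.

    # conftest.py -- hypothetical stop-gap shim (sketch only, not the upstream fix).
    # Assumption: the AttributeError comes from vcrpy's VCRHTTPResponse lacking the
    # version_string attribute that urllib3 >= 2.3 reads in its pool debug logging.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # urllib3 only interpolates this value into a debug log line, so a static
        # placeholder is enough to let cassette playback continue.
        VCRHTTPResponse.version_string = "HTTP/1.1"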
______________ RackspaceProviderTests.test_provider_authenticate _______________ self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/rackspace.py:80: in authenticate ??? src/lexicon/_private/providers/rackspace.py:279: in _auth_request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/v2.0/tokens' body = '{"auth": {"RAX-KSKEY:apiKeyCredentials": {"username": "placeholder_auth_username", "apiKey": "placeholder_auth_api_key"}}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '122'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
        if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET:
            raise

        # Reset the timeout for the recv() on the socket
        read_timeout = timeout_obj.read_timeout

        if not conn.is_closed:
            # In Python 3 socket.py will catch EAGAIN and return None when you
            # try and read into the file pointer created by http.client, which
            # instead raises a BadStatusLine exception. Instead of catching
            # the exception and assuming all BadStatusLine exceptions are read
            # timeouts, check for a zero timeout before making the request.
            if read_timeout == 0:
                raise ReadTimeoutError(
                    self, url, f"Read timed out. (read timeout={read_timeout})"
                )
            conn.timeout = read_timeout

        # Receive the response from the server
        try:
            response = conn.getresponse()
        except (BaseSSLError, OSError) as e:
            self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
            raise

        # Set properties that are used by the pooling layer.
        response.retries = retries
        response._connection = response_conn  # type: ignore[attr-defined]
        response._pool = self  # type: ignore[attr-defined]

        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ RackspaceProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
self = > ???
tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/rackspace.py:80: in authenticate ???
src/lexicon/_private/providers/rackspace.py:279: in _auth_request ???
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ RackspaceProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
self = > ???
tests/providers/integration_tests.py:133: _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/rackspace.py:80: in authenticate ???
src/lexicon/_private/providers/rackspace.py:279: in _auth_request ???
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ RackspaceProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
self = > ???
tests/providers/integration_tests.py:156: _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/rackspace.py:80: in authenticate ???
src/lexicon/_private/providers/rackspace.py:279: in _auth_request ???
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ RackspaceProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _
self = > ???
tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/rackspace.py:80: in authenticate ???
src/lexicon/_private/providers/rackspace.py:279: in _auth_request ???
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ RackspaceProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _
self = > ???
tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/rackspace.py:80: in authenticate ???
src/lexicon/_private/providers/rackspace.py:279: in _auth_request ???
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ RackspaceProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _
self = > ???
tests/providers/integration_tests.py:489: _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/rackspace.py:80: in authenticate ???
src/lexicon/_private/providers/rackspace.py:279: in _auth_request ???
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ RackspaceProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _
self = > ???
tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/rackspace.py:80: in authenticate ???
src/lexicon/_private/providers/rackspace.py:279: in _auth_request ???
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
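Every Rackspace failure above is the same error: urllib3's debug logging in connectionpool.py (added in recent urllib3 releases) reads response.version_string, and the cassette stub these tests replay through (VCRHTTPResponse) does not define that attribute. This points to a python-vcrpy/urllib3 version mismatch in the build chroot rather than a bug in dns-lexicon itself. A minimal stopgap sketch, assuming the stub class lives at vcr.stubs.VCRHTTPResponse and that urllib3 only needs the attribute for its log line, would be a conftest.py shim like the following; upgrading python-vcrpy to a release that knows about version_string is presumably the proper fix.

    # conftest.py (hypothetical sketch, not part of the package)
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # urllib3 logs response.version_string; a fixed value is enough for replayed cassettes.
        VCRHTTPResponse.version_string = "HTTP/1.1"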
_ RackspaceProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _
self = > ???
tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/rackspace.py:80: in authenticate ??? src/lexicon/_private/providers/rackspace.py:279: in _auth_request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/v2.0/tokens' body = '{"auth": {"RAX-KSKEY:apiKeyCredentials": {"username": "placeholder_auth_username", "apiKey": "placeholder_auth_api_key"}}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '122'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ RackspaceProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/rackspace.py:80: in authenticate ??? src/lexicon/_private/providers/rackspace.py:279: in _auth_request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/v2.0/tokens' body = '{"auth": {"RAX-KSKEY:apiKeyCredentials": {"username": "placeholder_auth_username", "apiKey": "placeholder_auth_api_key"}}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '122'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
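Every failure in this group is the same incompatibility: urllib3 2.3 started logging
response.version_string from _make_request, and the VCRHTTPResponse stub that vcrpy
substitutes while replaying the recorded cassettes for these provider tests does not
define that attribute, so the replayed authentication request dies inside urllib3
before any lexicon code runs. Until a vcrpy release that knows about the attribute is
packaged, a stopgap is to shim it in the test session. A minimal conftest.py sketch,
assuming vcrpy still exposes VCRHTTPResponse from vcr.stubs and that the cassettes only
contain the usual http.client integer versions (10, 11); none of this is taken from the
package itself:

    # conftest.py (sketch, hypothetical workaround; not part of dns-lexicon)
    # urllib3 >= 2.3 reads response.version_string when logging a finished request.
    # vcrpy's replay stub predates that attribute, so add a best-effort property.
    from vcr.stubs import VCRHTTPResponse  # assumed import path

    if not hasattr(VCRHTTPResponse, "version_string"):

        @property
        def _version_string(self):
            # Map http.client-style integer versions to the string urllib3 expects.
            return {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(getattr(self, "version", 11), "HTTP/1.1")

        VCRHTTPResponse.version_string = _version_string

The shim only touches the replay stub; live responses keep urllib3's real property, so it
can be dropped as soon as a fixed python-vcr reaches the repositories.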
[ the eight failures below run through the identical chain (integration_tests.py:418
  _construct_authenticated_provider -> rackspace.py:80 authenticate -> rackspace.py:279
  _auth_request -> requests -> urllib3 connectionpool.py:787 urlopen) and fail at
  connectionpool.py:551 with the same AttributeError; pytest reprints the same
  _make_request context for each, only the calling test line differs ]

_ RackspaceProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _
tests/providers/integration_tests.py:327:
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

_ RackspaceProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _
tests/providers/integration_tests.py:313:
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

_ RackspaceProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _
tests/providers/integration_tests.py:294:
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

_ RackspaceProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _
tests/providers/integration_tests.py:541:
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

_ RackspaceProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _
tests/providers/integration_tests.py:521:
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

_ RackspaceProviderTests.test_provider_when_calling_list_records_after_setting_ttl _
tests/providers/integration_tests.py:211:
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

_ RackspaceProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _
tests/providers/integration_tests.py:507:
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

_ RackspaceProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _
tests/providers/integration_tests.py:199:
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ RackspaceProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/rackspace.py:80: in authenticate ??? src/lexicon/_private/providers/rackspace.py:279: in _auth_request ??? 
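Every Rackspace failure above ends the same way: urllib3's _make_request reaches the log.debug call that reads response.version_string, but the response replayed from the VCR cassette is a VCRHTTPResponse stub that does not define that attribute, hence the AttributeError at connectionpool.py:551. A hypothetical local workaround (not part of the lexicon sources) would be to give the stub the missing attribute from a test conftest.py before any request is replayed; a python-vcrpy release that provides the attribute itself would be the cleaner fix.

    # conftest.py -- hypothetical workaround, assuming the tests replay vcrpy
    # cassettes as the tracebacks above indicate. VCRHTTPResponse is the stub class
    # named in the AttributeError; giving it a version_string satisfies the
    # log.debug call in urllib3's connectionpool._make_request shown above.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        VCRHTTPResponse.version_string = "HTTP/1.1"  # what a real HTTP/1.1 response would report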
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/v2.0/tokens' body = '{"auth": {"RAX-KSKEY:apiKeyCredentials": {"username": "placeholder_auth_username", "apiKey": "placeholder_auth_api_key"}}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '122'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ RackspaceProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ self = > ??? 
tests/providers/integration_tests.py:501: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/rackspace.py:80: in authenticate ??? src/lexicon/_private/providers/rackspace.py:279: in _auth_request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/v2.0/tokens' body = '{"auth": {"RAX-KSKEY:apiKeyCredentials": {"username": "placeholder_auth_username", "apiKey": "placeholder_auth_api_key"}}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '122'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
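The request locals above show requests passing preload_content=False and decode_content=False down to _make_request; a short sketch of what those flags mean at the public urllib3 level (placeholder URL, not part of this build):

    # Sketch only: with preload_content=False the body is not read during response
    # construction and can be streamed; decode_content controls whether the
    # Content-Encoding (e.g. gzip) is transparently decoded.
    import urllib3

    http = urllib3.PoolManager()
    resp = http.request(
        "GET",
        "https://example.com/data",   # placeholder URL
        preload_content=False,
        decode_content=True,
    )
    for chunk in resp.stream(1024):   # body is read lazily, 1 KiB at a time
        pass
    resp.release_conn()               # hand the connection back to the pool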
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ RackspaceProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/rackspace.py:80: in authenticate ??? src/lexicon/_private/providers/rackspace.py:279: in _auth_request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/v2.0/tokens' body = '{"auth": {"RAX-KSKEY:apiKeyCredentials": {"username": "placeholder_auth_username", "apiKey": "placeholder_auth_api_key"}}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '122'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ RackspaceProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/rackspace.py:80: in authenticate ??? src/lexicon/_private/providers/rackspace.py:279: in _auth_request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/v2.0/tokens' body = '{"auth": {"RAX-KSKEY:apiKeyCredentials": {"username": "placeholder_auth_username", "apiKey": "placeholder_auth_api_key"}}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '122'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
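The retries local shown above, Retry(total=0, connect=None, read=False, redirect=None, status=None), is what requests supplies for its default max_retries=0, so urllib3-level retries are effectively disabled in these tests. A sketch of how a requests Session would opt in, purely for illustration:

    # Sketch only: mounting an HTTPAdapter with an explicit Retry enables
    # urllib3-level retries for a requests Session.
    import requests
    from requests.adapters import HTTPAdapter
    from urllib3.util.retry import Retry

    session = requests.Session()
    adapter = HTTPAdapter(max_retries=Retry(total=3, backoff_factor=1.0))
    session.mount("https://", adapter)
    session.mount("http://", adapter)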
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ RackspaceProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/rackspace.py:80: in authenticate ??? src/lexicon/_private/providers/rackspace.py:279: in _auth_request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/v2.0/tokens' body = '{"auth": {"RAX-KSKEY:apiKeyCredentials": {"username": "placeholder_auth_username", "apiKey": "placeholder_auth_api_key"}}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '122'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
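The request being replayed in each Rackspace failure is the identity-token call visible in the locals: POST /v2.0/tokens with an RAX-KSKEY:apiKeyCredentials body built from the cassette placeholders. Reproduced with requests as a sketch; the identity endpoint host is an assumption, and the credentials are the recorded placeholders, not real secrets:

    # Sketch only: the endpoint host below is assumed, and the credentials are the
    # placeholder values recorded in the cassette, as shown in the locals above.
    import requests

    body = {
        "auth": {
            "RAX-KSKEY:apiKeyCredentials": {
                "username": "placeholder_auth_username",
                "apiKey": "placeholder_auth_api_key",
            }
        }
    }
    resp = requests.post(
        "https://identity.api.rackspacecloud.com/v2.0/tokens",  # assumed host
        json=body,
        headers={"Content-Type": "application/json"},
    )
    resp.raise_for_status()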
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ RackspaceProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/rackspace.py:80: in authenticate ??? src/lexicon/_private/providers/rackspace.py:279: in _auth_request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/v2.0/tokens' body = '{"auth": {"RAX-KSKEY:apiKeyCredentials": {"username": "placeholder_auth_username", "apiKey": "placeholder_auth_api_key"}}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '122'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ RackspaceProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? 
tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/rackspace.py:80: in authenticate ??? src/lexicon/_private/providers/rackspace.py:279: in _auth_request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/v2.0/tokens' body = '{"auth": {"RAX-KSKEY:apiKeyCredentials": {"username": "placeholder_auth_username", "apiKey": "placeholder_auth_api_key"}}}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '122'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
[each of the following failures takes the same path through requests (api.py:59 -> sessions.py:589 -> sessions.py:703 -> adapters.py:667) into urllib3 connectionpool.py:787 (urlopen) and fails inside the identical _make_request listing shown above; only the test entry point, the recorded request, and the error are repeated here]
_ RackspaceProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _
tests/providers/integration_tests.py:259 -> integration_tests.py:418 (_construct_authenticated_provider) -> rackspace.py:80 (authenticate) -> rackspace.py:279 (_auth_request) -> POST /v2.0/tokens
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
________________ Rage4ProviderTests.test_provider_authenticate _________________
tests/providers/integration_tests.py:106 -> integration_tests.py:418 (_construct_authenticated_provider) -> rage4.py:36 (authenticate) -> interfaces.py:163 (_get) -> rage4.py:155 (_request) -> GET /rapi/getdomainbyname/?name=capsulecd.com
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Rage4ProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
tests/providers/integration_tests.py:126 -> integration_tests.py:418 (_construct_authenticated_provider) -> rage4.py:36 (authenticate) -> interfaces.py:163 (_get) -> rage4.py:155 (_request) -> GET /rapi/getdomainbyname/?name=capsulecd.com
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Rage4ProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
tests/providers/integration_tests.py:133 -> integration_tests.py:418 (_construct_authenticated_provider) -> rage4.py:36 (authenticate) -> interfaces.py:163 (_get) -> rage4.py:155 (_request) -> GET /rapi/getdomainbyname/?name=capsulecd.com
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Rage4ProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
tests/providers/integration_tests.py:156 -> integration_tests.py:418 (_construct_authenticated_provider) -> rage4.py:36 (authenticate) -> interfaces.py:163 (_get) -> rage4.py:155 (_request) -> GET /rapi/getdomainbyname/?name=capsulecd.com
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Rage4ProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _
tests/providers/integration_tests.py:147 -> integration_tests.py:418 (_construct_authenticated_provider) -> rage4.py:36 (authenticate) -> interfaces.py:163 (_get) -> rage4.py:155 (_request) -> GET /rapi/getdomainbyname/?name=capsulecd.com
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Rage4ProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _
tests/providers/integration_tests.py:140 -> integration_tests.py:418 (_construct_authenticated_provider) -> rage4.py:36 (authenticate) -> interfaces.py:163 (_get) -> rage4.py:155 (_request) -> GET /rapi/getdomainbyname/?name=capsulecd.com
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
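A likely read of these failures: during cassette playback vcrpy hands urllib3 its own stub response, VCRHTTPResponse, and the urllib3 installed on this builder formats response.version_string into the debug log line at connectionpool.py:551 (visible in the listing above). The stub does not define that attribute, so the logging call itself raises. A quick interactive check along the following lines can confirm which side is missing the attribute; the module paths urllib3.response.BaseHTTPResponse and vcr.stubs.VCRHTTPResponse are assumptions based on current releases and should be verified against the packaged versions.

# Diagnostic sketch (assumes urllib3 2.x and vcrpy are installed as on this builder;
# module/attribute names are assumptions to double-check, not taken from this log).
import urllib3
import urllib3.response
import vcr.stubs

print(urllib3.__version__)
# Newer urllib3 exposes version_string on its response base class; True here means the
# debug logging at connectionpool.py:551 will try to read it.
print(hasattr(urllib3.response.BaseHTTPResponse, "version_string"))
# vcrpy's playback stub is what the pool actually receives during these tests;
# False here reproduces the AttributeError seen above.
print(hasattr(vcr.stubs.VCRHTTPResponse, "version_string"))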
_ Rage4ProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _
tests/providers/integration_tests.py:489 -> integration_tests.py:418 (_construct_authenticated_provider) -> rage4.py:36 (authenticate) -> interfaces.py:163 (_get) -> rage4.py:155 (_request) -> GET /rapi/getdomainbyname/?name=capsulecd.com
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
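Until the packaged vcrpy understands the newer urllib3 response API, the usual options are to update or patch python-vcrpy, to build against an older urllib3, or to shim the missing attribute for the test run only. The snippet below is a minimal stop-gap sketch of the last option; the conftest.py location and the "HTTP/1.1" default are assumptions for illustration, not something the upstream test suite ships.

# conftest.py (hypothetical stop-gap, not part of dns-lexicon or vcrpy):
# give vcrpy's playback stub the attribute urllib3 wants to log, so recorded
# cassettes can still be replayed under the newer urllib3.
import vcr.stubs

if not hasattr(vcr.stubs.VCRHTTPResponse, "version_string"):
    # A plain class attribute is enough: urllib3 only formats it into a debug message.
    vcr.stubs.VCRHTTPResponse.version_string = "HTTP/1.1"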
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ Rage4ProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/rage4.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/rage4.py:155: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/rapi/getdomainbyname/?name=capsulecd.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ Rage4ProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/rage4.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/rage4.py:155: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/rapi/getdomainbyname/?name=capsulecd.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ Rage4ProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? 
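Every Rage4ProviderTests failure in this run is the same incompatibility: the urllib3 on the build host logs
response.version_string inside _make_request() (connectionpool.py:551 above), but the recorded-response stub
that vcrpy substitutes during cassette playback, VCRHTTPResponse, predates that attribute. A minimal
test-side shim could paper over the gap until a fixed vcrpy is packaged; the snippet below is a hypothetical
conftest.py sketch (it assumes the stub is still importable as vcr.stubs.VCRHTTPResponse) and is not part of
dns-lexicon or this build:

    # conftest.py -- hypothetical compatibility shim for the failures above
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # Cassette playback never knows the real HTTP version, so a fixed
        # default is enough to satisfy urllib3's debug logging.
        VCRHTTPResponse.version_string = "HTTP/1.1"

Upgrading python-vcrpy to a release that adds version_string itself would make any such shim unnecessary.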
_ Rage4ProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _
tests/providers/integration_tests.py:313:
[identical call chain and _make_request() source listing omitted]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ Rage4ProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _
tests/providers/integration_tests.py:294:
[identical call chain and _make_request() source listing omitted]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ Rage4ProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _
tests/providers/integration_tests.py:541:
[identical call chain and _make_request() source listing omitted]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ Rage4ProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _
tests/providers/integration_tests.py:521:
[identical call chain and _make_request() source listing omitted]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
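If patching third-party stubs is undesirable for a package build, the same breakage can instead be routed
around by skipping the cassette-backed provider tests until a fixed vcrpy lands. A hypothetical conftest.py
hook (only the class name is taken from the failing tests above; everything else is an assumption) might
look like:

    import pytest

    def pytest_collection_modifyitems(config, items):
        # Attach a skip marker to every Rage4ProviderTests item; the reason then
        # appears in the pytest summary instead of an AttributeError traceback.
        skip_vcr = pytest.mark.skip(
            reason="vcrpy stub lacks version_string expected by current urllib3"
        )
        for item in items:
            if "Rage4ProviderTests" in item.nodeid:
                item.add_marker(skip_vcr)

Deselecting on the command line (pytest -k 'not Rage4ProviderTests') achieves the same result without extra files.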
_ Rage4ProviderTests.test_provider_when_calling_list_records_after_setting_ttl _
tests/providers/integration_tests.py:211:
[identical call chain and _make_request() source listing omitted]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ Rage4ProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _
tests/providers/integration_tests.py:507:
[identical call chain and _make_request() source listing omitted]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ Rage4ProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _
self = > ??? tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/rage4.py:36: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/rage4.py:155: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/rapi/getdomainbyname/?name=capsulecd.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF91c2VybmFtZTpwbGFjZWhvbGRlcl9hdXRoX3Rva2Vu'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used.
[... remainder of the urllib3 _make_request docstring and body, identical to the listing above ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Rage4ProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _
self = > ???
tests/providers/integration_tests.py:185:
[... traceback through lexicon, requests, and urllib3, request locals, and the _make_request listing, all identical to the first Rage4 failure above ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Rage4ProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _
self = > ???
tests/providers/integration_tests.py:501:
[... traceback, request locals, and _make_request listing identical to the first Rage4 failure above ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
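Every failure in this block is the same crash repeated: urllib3 2.3's HTTPConnectionPool._make_request() now logs response.version_string (the connectionpool.py:551 frame above), but under VCR the response object is vcrpy's VCRHTTPResponse stub, which in the version installed in this chroot does not define that attribute, so each cassette-backed provider test dies before its own assertions run. A minimal local shim is sketched below, assuming the stub still lives at vcr.stubs.VCRHTTPResponse and that urllib3 only reads the attribute for this one debug log line; the file name and placement are illustrative, not part of dns-lexicon:

    # conftest.py (illustrative workaround, not part of the package)
    # Give vcrpy's response stub the attribute that urllib3 >= 2.3 reads
    # when it logs a completed request in HTTPConnectionPool._make_request().
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # The value is only interpolated into a debug log line, so a fixed
        # placeholder is enough to stop the AttributeError seen above.
        VCRHTTPResponse.version_string = "HTTP/1.1"

The cleaner fix is upstream: a vcrpy release that ships this attribute itself, or building against urllib3 older than 2.3, would make such a shim unnecessary.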
_ Rage4ProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _
self = > ???
tests/providers/integration_tests.py:173:
[... traceback, request locals, and _make_request listing identical to the first Rage4 failure above ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Rage4ProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _
self = > ???
tests/providers/integration_tests.py:166:
[... traceback, request locals, and _make_request listing identical to the first Rage4 failure above ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ Rage4ProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _
self = > ???
tests/providers/integration_tests.py:275:
[... traceback, request locals, and _make_request listing identical to the first Rage4 failure above ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
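The run now moves from the Rage4 provider to RcodezeroProviderTests, and the identical AttributeError keeps appearing: the failure count tracks how many providers replay recorded cassettes through vcrpy, not how many distinct bugs exist. Before patching anything, a quick check in the build chroot can confirm the version mismatch; the script name is illustrative and it only assumes urllib3 and vcrpy are importable there:

    # check_vcr_urllib3.py (illustrative diagnostic)
    # Report the installed versions and whether vcrpy's stub already carries
    # the attribute that urllib3 >= 2.3 logs for every response.
    from importlib.metadata import version
    from vcr.stubs import VCRHTTPResponse

    print("urllib3:", version("urllib3"), "vcrpy:", version("vcrpy"))
    print("version_string on VCRHTTPResponse:", hasattr(VCRHTTPResponse, "version_string"))

If the second line prints False, every remaining cassette-backed provider test in this log can be expected to fail with the same error until vcrpy or urllib3 is adjusted.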
______________ RcodezeroProviderTests.test_provider_authenticate _______________
self = > ???
tests/providers/integration_tests.py:106:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/rcodezero.py:35: in authenticate ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/rcodezero.py:176: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =  conn =  method = 'GET', url = '/api/v1/zones/lexicon-test.at', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =  preload_content = False, decode_content = False, enforce_content_length = True
[... _make_request listing identical to the Rage4 failures above ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ RcodezeroProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
self = > ???
tests/providers/integration_tests.py:126:
[... traceback, request locals, and _make_request listing identical to the authenticate failure above ...]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ RcodezeroProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
self = > ???
tests/providers/integration_tests.py:133:
[... traceback and request locals identical to the authenticate failure above ...]
def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used.
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ RcodezeroProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = > ??? tests/providers/integration_tests.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/rcodezero.py:35: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/rcodezero.py:176: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/zones/lexicon-test.at', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ RcodezeroProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = > ??? 
tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/rcodezero.py:35: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/rcodezero.py:176: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/zones/lexicon-test.at', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ RcodezeroProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/rcodezero.py:35: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/rcodezero.py:176: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/zones/lexicon-test.at', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ RcodezeroProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _ self = > ??? tests/providers/integration_tests.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/rcodezero.py:35: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/rcodezero.py:176: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/zones/lexicon-test.at', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ RcodezeroProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ self = > ??? tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/rcodezero.py:35: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/rcodezero.py:176: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/zones/lexicon-test.at', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ RcodezeroProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? 
tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/rcodezero.py:35: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/rcodezero.py:176: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/zones/lexicon-test.at', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
[Nine further RcodezeroProviderTests failures follow with identical tracebacks: each fails in authenticate() at src/lexicon/_private/providers/rcodezero.py:35 while replaying the recorded GET /api/v1/zones/lexicon-test.at request through requests and urllib3, and ends with the same AttributeError raised from urllib3's _make_request at connectionpool.py:551. Only the failure headers differ:]
_ RcodezeroProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ (tests/providers/integration_tests.py:327)
_ RcodezeroProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ (tests/providers/integration_tests.py:313)
_ RcodezeroProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ (tests/providers/integration_tests.py:294)
_ RcodezeroProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ (tests/providers/integration_tests.py:541)
_ RcodezeroProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ (tests/providers/integration_tests.py:521)
_ RcodezeroProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ (tests/providers/integration_tests.py:211)
_ RcodezeroProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ (tests/providers/integration_tests.py:507)
_ RcodezeroProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ (tests/providers/integration_tests.py:199)
_ RcodezeroProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ (tests/providers/integration_tests.py:185)
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError (identical for each test above)
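All of these failures are the same crash: after replaying a recorded response, urllib3's connection pool logs response.version_string (the log.debug call at connectionpool.py:551 visible in the traceback), and the VCRHTTPResponse stub that vcrpy substitutes during cassette playback does not define that attribute. version_string only appeared on urllib3's response objects in urllib3 2.3, so this looks like the known vcrpy/urllib3 2.3 incompatibility in the chroot's python-vcrpy rather than a dns-lexicon or riscv64-specific problem. Below is a minimal sketch of a local stop-gap; it is not part of dns-lexicon or vcrpy, and it assumes the stub class is importable as vcr.stubs.VCRHTTPResponse (as in current vcrpy) and that nothing in the playback path needs the real protocol version:

    # conftest.py (hypothetical stop-gap for the build chroot, not upstream code)
    # Give vcrpy's stub response the attribute that urllib3 >= 2.3 reads when it
    # logs a completed request, so the recorded cassettes can play back again.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # The value only feeds urllib3's debug log line; no real traffic is sent
        # during playback, so a fixed placeholder is enough here.
        VCRHTTPResponse.version_string = "HTTP/1.1"

The cleaner fix for the package is to rebuild once python-vcrpy ships a release with native urllib3 2.3 support (or to constrain urllib3 below 2.3 for check()), at which point these integration tests should pass again without any shim.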
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ RcodezeroProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/rcodezero.py:35: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/rcodezero.py:176: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/zones/lexicon-test.at', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ RcodezeroProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ self = > ??? tests/providers/integration_tests.py:501: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/rcodezero.py:35: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/rcodezero.py:176: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/zones/lexicon-test.at', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
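Every one of these Rcodezero failures (and the Route53 failure further down) has the same root cause, and it is not in the provider code: the urllib3 in this chroot logs response.version_string at connectionpool.py:551, while the tests replay recorded HTTP traffic through vcrpy, whose VCRHTTPResponse stub predates that attribute, so the AttributeError fires before the provider ever sees a response. The following conftest-level shim is only a sketch of what the stub is missing: the import location is vcrpy's real vcr.stubs module, but the version attribute lookup and the shim itself are assumptions, not the fix that was actually applied to the package.

    # conftest.py -- compatibility shim sketch, assuming vcrpy exposes
    # VCRHTTPResponse in vcr.stubs; not the packaged fix, just an illustration.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):

        @property
        def version_string(self):
            # urllib3's logging expects the textual HTTP version; map the
            # http.client-style integer (attribute name assumed) onto it.
            return {9: "HTTP/0.9", 10: "HTTP/1.0", 11: "HTTP/1.1"}.get(
                getattr(self, "version", 11), "HTTP/1.1"
            )

        VCRHTTPResponse.version_string = version_string  # type: ignore[attr-defined]

With an attribute like this on the playback response, the logging call at connectionpool.py:551 would succeed; if a python-vcrpy release that already carries an equivalent attribute is available, updating that dependency is the cleaner route.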
_ RcodezeroProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _
    identical traceback (entry point tests/providers/integration_tests.py:173); same AttributeError at urllib3/connectionpool.py:551
_ RcodezeroProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _
    identical traceback (entry point tests/providers/integration_tests.py:166); same AttributeError at urllib3/connectionpool.py:551
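If a fixed python-vcrpy is not available when the package is rebuilt, one pragmatic option would be to skip the cassette-driven suites during check(). The sketch below uses pytest's standard collection hook; the suite names are taken from this log, but everything else, including the decision to skip at all, is an assumption rather than something taken from the PKGBUILD.

    # conftest.py -- hypothetical skip hook for the check() phase; suite names
    # come from this log, the skip policy itself is assumed.
    import pytest

    AFFECTED_SUITES = ("RcodezeroProviderTests", "Route53ProviderTests")

    def pytest_collection_modifyitems(config, items):
        skip = pytest.mark.skip(
            reason="VCRHTTPResponse lacks version_string expected by this urllib3"
        )
        for item in items:
            # nodeid looks like "tests/providers/test_<name>.py::<Suite>::test_..."
            if any(suite in item.nodeid for suite in AFFECTED_SUITES):
                item.add_marker(skip)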
_ RcodezeroProviderTests.test_provider_when_calling_update_record_should_modify_record _
    identical traceback (entry point tests/providers/integration_tests.py:240); same AttributeError at urllib3/connectionpool.py:551
_ RcodezeroProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _
    identical traceback (entry point tests/providers/integration_tests.py:251); same AttributeError at urllib3/connectionpool.py:551
_ RcodezeroProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _
    identical traceback (entry point tests/providers/integration_tests.py:275); same AttributeError at urllib3/connectionpool.py:551
_ RcodezeroProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _
    identical traceback (entry point tests/providers/integration_tests.py:259), again ending inside urllib3's _make_request:
        [... _make_request listing elided; full copy below ...]
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_______________ Route53ProviderTests.test_provider_authenticate ________________
[urllib3/botocore source frames identical to the failure above elided]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
During handling of the above exception, another exception occurred:
tests/providers/integration_tests.py:106: ???
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/route53.py:144: in authenticate ???
src/lexicon/_private/providers/route53.py:167: in _lookup_hosted_zone ???
/usr/lib/python3.13/site-packages/botocore/endpoint.py:383: in _send ???
E botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/botocore/httpsession.py:509: HTTPClientError
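These Route53 failures all bottom out in the same place: urllib3 2.3 started logging response.version_string in connectionpool._make_request (the frame at connectionpool.py:551 above), and the VCRHTTPResponse stub that vcrpy substitutes for a real urllib3 response when replaying cassettes does not define that attribute, so the replayed request dies before the cassette body is ever read. Until a vcrpy release that knows about the attribute is packaged, a minimal local shim in the test suite's conftest.py could paper over it. This is only a sketch under that assumption, not the upstream fix, and the "HTTP/1.1" default is an assumption about the recorded cassettes:

    # conftest.py -- compatibility shim for urllib3 >= 2.3 with an older vcrpy stub
    from vcr.stubs import VCRHTTPResponse

    # urllib3 >= 2.3 reads response.version_string when logging the request.
    # Older vcrpy stubs predate the attribute; give the class a plausible
    # default so replayed responses survive the log call.
    if not hasattr(VCRHTTPResponse, "version_string"):
        VCRHTTPResponse.version_string = "HTTP/1.1"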
______ Route53ProviderTests.test_provider_authenticate_private_zone_false ______
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
During handling of the above exception, another exception occurred:
tests/providers/test_route53.py:35: ???
src/lexicon/_private/providers/route53.py:144: in authenticate ???
src/lexicon/_private/providers/route53.py:167: in _lookup_hosted_zone ???
/usr/lib/python3.13/site-packages/botocore/endpoint.py:383: in _send ???
E botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/botocore/httpsession.py:509: HTTPClientError
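Every one of these tests fails before it gets past provider.authenticate(): per the frames above, _lookup_hosted_zone makes the first boto3 call (a GET to /2013-04-01/hostedzonesbyname), and that call dies inside the VCR-replayed transport. A quick way to confirm the version mismatch on the build host, before suspecting the cassettes themselves, is to check which urllib3 and vcrpy are installed and whether the stub already carries the attribute. This is a throwaway check, not part of the package:

    from importlib.metadata import version

    from vcr.stubs import VCRHTTPResponse

    # If urllib3 is >= 2.3 and the stub lacks version_string, every replayed
    # request fails exactly as in the tracebacks above.
    print("urllib3:", version("urllib3"))
    print("vcrpy:  ", version("vcrpy"))
    print("stub has version_string:", hasattr(VCRHTTPResponse, "version_string"))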
_ Route53ProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
During handling of the above exception, another exception occurred:
tests/providers/integration_tests.py:126: ???
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/route53.py:144: in authenticate ???
src/lexicon/_private/providers/route53.py:167: in _lookup_hosted_zone ???
/usr/lib/python3.13/site-packages/botocore/endpoint.py:383: in _send ???
E botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/botocore/httpsession.py:509: HTTPClientError
_ Route53ProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
During handling of the above exception, another exception occurred:
tests/providers/integration_tests.py:133: ???
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/route53.py:144: in authenticate ???
src/lexicon/_private/providers/route53.py:167: in _lookup_hosted_zone ???
/usr/lib/python3.13/site-packages/botocore/endpoint.py:383: in _send ???
E botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/botocore/httpsession.py:509: HTTPClientError
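For the package build itself, the least invasive option until python-vcrpy catches up is to skip the affected provider suite during check() rather than patch third-party code inside the chroot. A hypothetical conftest.py hook along these lines would do it; the nodeid filter and the skip reason are assumptions, not something taken from the dns-lexicon sources:

    import pytest
    from vcr.stubs import VCRHTTPResponse


    def pytest_collection_modifyitems(config, items):
        # Only skip when the installed vcrpy stub is too old for urllib3 >= 2.3;
        # with a fixed vcrpy this hook becomes a no-op.
        if hasattr(VCRHTTPResponse, "version_string"):
            return
        skip = pytest.mark.skip(reason="vcrpy stub lacks version_string (urllib3 >= 2.3)")
        for item in items:
            if "route53" in item.nodeid:
                item.add_marker(skip)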
_ Route53ProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
During handling of the above exception, another exception occurred:
tests/providers/integration_tests.py:156: ???
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/route53.py:144: in authenticate ???
src/lexicon/_private/providers/route53.py:167: in _lookup_hosted_zone ???
/usr/lib/python3.13/site-packages/botocore/endpoint.py:383: in _send ???
E botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/botocore/httpsession.py:509: HTTPClientError
_ Route53ProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. 
host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) > urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) /usr/lib/python3.13/site-packages/botocore/httpsession.py:464: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/2013-04-01/hostedzonesbyname', body = None headers = {'User-Agent': b'Boto3/1.35.88 md/Botocore#1.35.88 ua/2.0 os/linux#6.1.80-2-sophgo-11457-g83ab3eda46e6 md/arch#riscv64...7a265d36a27d258910', 'amz-sdk-invocation-id': b'88596556-ebf2-4e27-9a47-2c47c4469662', 'amz-sdk-request': b'attempt=1'} retries = Retry(total=False, connect=None, read=None, redirect=0, status=None) timeout = Timeout(connect=60, read=60, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? 
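Every failure above bottoms out in the same place: urllib3's connection pool logs the request line with response.version_string, an attribute that recent urllib3 releases added to their own response objects, while the VCRHTTPResponse stub that vcrpy substitutes during cassette playback never defines it. botocore then catches the AttributeError and re-raises it as the generic HTTPClientError seen at the end of each traceback. A minimal test-only shim, assuming vcrpy exposes the stub as vcr.stubs.VCRHTTPResponse and that nothing in these tests inspects the protocol version, could look like the following sketch (a workaround assumption, not something the package or the upstream test suite ships):

# conftest.py, hypothetical workaround sketch
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):
    # Cassettes do not record the HTTP protocol version string; assuming
    # HTTP/1.1 is enough to satisfy urllib3's debug-logging call.
    VCRHTTPResponse.version_string = "HTTP/1.1"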
tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/route53.py:144: in authenticate ??? src/lexicon/_private/providers/route53.py:167: in _lookup_hosted_zone ??? /usr/lib/python3.13/site-packages/botocore/client.py:569: in _api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1005: in _make_api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1029: in _make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:119: in make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:200: in _send_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:360: in _needs_retry ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:412: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:256: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:239: in _emit ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:207: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:284: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:307: in _should_retry ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:363: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:247: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:416: in _check_caught_exception ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:279: in _do_get_response ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:383: in _send ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) http_response = botocore.awsrequest.AWSResponse( request.url, urllib_response.status, urllib_response.headers, urllib_response, ) if not request.stream_output: # Cause the raw stream to be exhausted immediately. 
We do it # this way instead of using preload_content because # preload_content will never buffer chunked responses http_response.content return http_response except URLLib3SSLError as e: raise SSLError(endpoint_url=request.url, error=e) except (NewConnectionError, socket.gaierror) as e: raise EndpointConnectionError(endpoint_url=request.url, error=e) except ProxyError as e: raise ProxyConnectionError( proxy_url=mask_proxy_url(proxy_url), error=e ) except URLLib3ConnectTimeoutError as e: raise ConnectTimeoutError(endpoint_url=request.url, error=e) except URLLib3ReadTimeoutError as e: raise ReadTimeoutError(endpoint_url=request.url, error=e) except ProtocolError as e: raise ConnectionClosedError( error=e, request=request, endpoint_url=request.url ) except Exception as e: message = 'Exception received when sending urllib3 HTTP request' logger.debug(message, exc_info=True) > raise HTTPClientError(error=e) E botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/botocore/httpsession.py:509: HTTPClientError _ Route53ProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. 
host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) > urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) /usr/lib/python3.13/site-packages/botocore/httpsession.py:464: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/2013-04-01/hostedzonesbyname', body = None headers = {'User-Agent': b'Boto3/1.35.88 md/Botocore#1.35.88 ua/2.0 os/linux#6.1.80-2-sophgo-11457-g83ab3eda46e6 md/arch#riscv64...7a265d36a27d258910', 'amz-sdk-invocation-id': b'2eba3b4d-276a-49ad-a8da-e692b8f2a0dc', 'amz-sdk-request': b'attempt=1'} retries = Retry(total=False, connect=None, read=None, redirect=0, status=None) timeout = Timeout(connect=60, read=60, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? 
tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/route53.py:144: in authenticate ??? src/lexicon/_private/providers/route53.py:167: in _lookup_hosted_zone ??? /usr/lib/python3.13/site-packages/botocore/client.py:569: in _api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1005: in _make_api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1029: in _make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:119: in make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:200: in _send_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:360: in _needs_retry ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:412: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:256: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:239: in _emit ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:207: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:284: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:307: in _should_retry ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:363: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:247: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:416: in _check_caught_exception ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:279: in _do_get_response ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:383: in _send ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) http_response = botocore.awsrequest.AWSResponse( request.url, urllib_response.status, urllib_response.headers, urllib_response, ) if not request.stream_output: # Cause the raw stream to be exhausted immediately. 
We do it # this way instead of using preload_content because # preload_content will never buffer chunked responses http_response.content return http_response except URLLib3SSLError as e: raise SSLError(endpoint_url=request.url, error=e) except (NewConnectionError, socket.gaierror) as e: raise EndpointConnectionError(endpoint_url=request.url, error=e) except ProxyError as e: raise ProxyConnectionError( proxy_url=mask_proxy_url(proxy_url), error=e ) except URLLib3ConnectTimeoutError as e: raise ConnectTimeoutError(endpoint_url=request.url, error=e) except URLLib3ReadTimeoutError as e: raise ReadTimeoutError(endpoint_url=request.url, error=e) except ProtocolError as e: raise ConnectionClosedError( error=e, request=request, endpoint_url=request.url ) except Exception as e: message = 'Exception received when sending urllib3 HTTP request' logger.debug(message, exc_info=True) > raise HTTPClientError(error=e) E botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/botocore/httpsession.py:509: HTTPClientError _ Route53ProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. 
host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) > urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) /usr/lib/python3.13/site-packages/botocore/httpsession.py:464: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/2013-04-01/hostedzonesbyname', body = None headers = {'User-Agent': b'Boto3/1.35.88 md/Botocore#1.35.88 ua/2.0 os/linux#6.1.80-2-sophgo-11457-g83ab3eda46e6 md/arch#riscv64...7a265d36a27d258910', 'amz-sdk-invocation-id': b'f455807a-ee06-4aaf-82e4-aa6ca4684d8c', 'amz-sdk-request': b'attempt=1'} retries = Retry(total=False, connect=None, read=None, redirect=0, status=None) timeout = Timeout(connect=60, read=60, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? 
tests/providers/integration_tests.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/route53.py:144: in authenticate ??? src/lexicon/_private/providers/route53.py:167: in _lookup_hosted_zone ??? /usr/lib/python3.13/site-packages/botocore/client.py:569: in _api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1005: in _make_api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1029: in _make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:119: in make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:200: in _send_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:360: in _needs_retry ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:412: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:256: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:239: in _emit ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:207: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:284: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:307: in _should_retry ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:363: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:247: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:416: in _check_caught_exception ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:279: in _do_get_response ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:383: in _send ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) http_response = botocore.awsrequest.AWSResponse( request.url, urllib_response.status, urllib_response.headers, urllib_response, ) if not request.stream_output: # Cause the raw stream to be exhausted immediately. 
We do it # this way instead of using preload_content because # preload_content will never buffer chunked responses http_response.content return http_response except URLLib3SSLError as e: raise SSLError(endpoint_url=request.url, error=e) except (NewConnectionError, socket.gaierror) as e: raise EndpointConnectionError(endpoint_url=request.url, error=e) except ProxyError as e: raise ProxyConnectionError( proxy_url=mask_proxy_url(proxy_url), error=e ) except URLLib3ConnectTimeoutError as e: raise ConnectTimeoutError(endpoint_url=request.url, error=e) except URLLib3ReadTimeoutError as e: raise ReadTimeoutError(endpoint_url=request.url, error=e) except ProtocolError as e: raise ConnectionClosedError( error=e, request=request, endpoint_url=request.url ) except Exception as e: message = 'Exception received when sending urllib3 HTTP request' logger.debug(message, exc_info=True) > raise HTTPClientError(error=e) E botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/botocore/httpsession.py:509: HTTPClientError _ Route53ProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. 
host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) > urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) /usr/lib/python3.13/site-packages/botocore/httpsession.py:464: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/2013-04-01/hostedzonesbyname', body = None headers = {'User-Agent': b'Boto3/1.35.88 md/Botocore#1.35.88 ua/2.0 os/linux#6.1.80-2-sophgo-11457-g83ab3eda46e6 md/arch#riscv64...7a265d36a27d258910', 'amz-sdk-invocation-id': b'ec7e1de5-5a50-4562-9db8-38c8cafcaf9a', 'amz-sdk-request': b'attempt=1'} retries = Retry(total=False, connect=None, read=None, redirect=0, status=None) timeout = Timeout(connect=60, read=60, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? 
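The failure does not depend on anything Route53-specific in lexicon; any botocore call replayed through a VCR cassette reaches the same logging line as soon as urllib3 tries to report the response. A hypothetical reproduction outside the test suite, assuming a previously recorded cassette file exists (the "route53.yaml" path and the dummy credentials below are made up for illustration):

import boto3
import vcr

client = boto3.client(
    "route53",
    region_name="us-east-1",
    aws_access_key_id="testing",        # dummy values; the cassette is replayed,
    aws_secret_access_key="testing",    # so nothing is sent to AWS
)
with vcr.use_cassette("route53.yaml"):  # assumed pre-recorded cassette
    client.list_hosted_zones_by_name(DNSName="example.com")
# With urllib3's version_string logging and an unpatched vcrpy, this raises
# botocore.exceptions.HTTPClientError wrapping the same AttributeError.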
tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/route53.py:144: in authenticate ??? src/lexicon/_private/providers/route53.py:167: in _lookup_hosted_zone ??? /usr/lib/python3.13/site-packages/botocore/client.py:569: in _api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1005: in _make_api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1029: in _make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:119: in make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:200: in _send_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:360: in _needs_retry ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:412: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:256: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:239: in _emit ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:207: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:284: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:307: in _should_retry ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:363: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:247: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:416: in _check_caught_exception ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:279: in _do_get_response ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:383: in _send ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) http_response = botocore.awsrequest.AWSResponse( request.url, urllib_response.status, urllib_response.headers, urllib_response, ) if not request.stream_output: # Cause the raw stream to be exhausted immediately. 
We do it # this way instead of using preload_content because # preload_content will never buffer chunked responses http_response.content return http_response except URLLib3SSLError as e: raise SSLError(endpoint_url=request.url, error=e) except (NewConnectionError, socket.gaierror) as e: raise EndpointConnectionError(endpoint_url=request.url, error=e) except ProxyError as e: raise ProxyConnectionError( proxy_url=mask_proxy_url(proxy_url), error=e ) except URLLib3ConnectTimeoutError as e: raise ConnectTimeoutError(endpoint_url=request.url, error=e) except URLLib3ReadTimeoutError as e: raise ReadTimeoutError(endpoint_url=request.url, error=e) except ProtocolError as e: raise ConnectionClosedError( error=e, request=request, endpoint_url=request.url ) except Exception as e: message = 'Exception received when sending urllib3 HTTP request' logger.debug(message, exc_info=True) > raise HTTPClientError(error=e) E botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/botocore/httpsession.py:509: HTTPClientError _ Route53ProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. 
host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) > urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) /usr/lib/python3.13/site-packages/botocore/httpsession.py:464: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/2013-04-01/hostedzonesbyname', body = None headers = {'User-Agent': b'Boto3/1.35.88 md/Botocore#1.35.88 ua/2.0 os/linux#6.1.80-2-sophgo-11457-g83ab3eda46e6 md/arch#riscv64...315ea04645bcbb5d01', 'amz-sdk-invocation-id': b'6c4f33ff-2ff0-435e-a8cf-4eafaff106c3', 'amz-sdk-request': b'attempt=1'} retries = Retry(total=False, connect=None, read=None, redirect=0, status=None) timeout = Timeout(connect=60, read=60, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? 
tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/route53.py:144: in authenticate ??? src/lexicon/_private/providers/route53.py:167: in _lookup_hosted_zone ??? /usr/lib/python3.13/site-packages/botocore/client.py:569: in _api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1005: in _make_api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1029: in _make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:119: in make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:200: in _send_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:360: in _needs_retry ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:412: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:256: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:239: in _emit ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:207: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:284: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:307: in _should_retry ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:363: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:247: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:416: in _check_caught_exception ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:279: in _do_get_response ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:383: in _send ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) http_response = botocore.awsrequest.AWSResponse( request.url, urllib_response.status, urllib_response.headers, urllib_response, ) if not request.stream_output: # Cause the raw stream to be exhausted immediately. 
We do it # this way instead of using preload_content because # preload_content will never buffer chunked responses http_response.content return http_response except URLLib3SSLError as e: raise SSLError(endpoint_url=request.url, error=e) except (NewConnectionError, socket.gaierror) as e: raise EndpointConnectionError(endpoint_url=request.url, error=e) except ProxyError as e: raise ProxyConnectionError( proxy_url=mask_proxy_url(proxy_url), error=e ) except URLLib3ConnectTimeoutError as e: raise ConnectTimeoutError(endpoint_url=request.url, error=e) except URLLib3ReadTimeoutError as e: raise ReadTimeoutError(endpoint_url=request.url, error=e) except ProtocolError as e: raise ConnectionClosedError( error=e, request=request, endpoint_url=request.url ) except Exception as e: message = 'Exception received when sending urllib3 HTTP request' logger.debug(message, exc_info=True) > raise HTTPClientError(error=e) E botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/botocore/httpsession.py:509: HTTPClientError _ Route53ProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. 
host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) > urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) /usr/lib/python3.13/site-packages/botocore/httpsession.py:464: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/2013-04-01/hostedzonesbyname', body = None headers = {'User-Agent': b'Boto3/1.35.88 md/Botocore#1.35.88 ua/2.0 os/linux#6.1.80-2-sophgo-11457-g83ab3eda46e6 md/arch#riscv64...315ea04645bcbb5d01', 'amz-sdk-invocation-id': b'a86aa050-ba2d-4c20-9211-0e1064fa16c4', 'amz-sdk-request': b'attempt=1'} retries = Retry(total=False, connect=None, read=None, redirect=0, status=None) timeout = Timeout(connect=60, read=60, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? 
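The AttributeError above is the actual root cause of this whole block of Route53 failures: newer urllib3 (2.3 and later) logs response.version_string inside HTTPConnectionPool._make_request, while the VCRHTTPResponse stub that vcrpy substitutes for the real response during cassette playback predates that attribute. A quick, illustrative way to confirm the mismatch in the build chroot (the exact versions installed there are an assumption, not something this log records):

    # Illustrative check of the urllib3 / vcrpy mismatch; versions are assumptions.
    import urllib3
    from vcr.stubs import VCRHTTPResponse

    print(urllib3.__version__)                          # expected: 2.3.x or later
    print(hasattr(VCRHTTPResponse, "version_string"))   # False -> the AttributeError above
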
tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/route53.py:144: in authenticate ??? src/lexicon/_private/providers/route53.py:167: in _lookup_hosted_zone ??? /usr/lib/python3.13/site-packages/botocore/client.py:569: in _api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1005: in _make_api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1029: in _make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:119: in make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:200: in _send_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:360: in _needs_retry ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:412: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:256: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:239: in _emit ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:207: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:284: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:307: in _should_retry ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:363: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:247: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:416: in _check_caught_exception ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:279: in _do_get_response ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:383: in _send ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) http_response = botocore.awsrequest.AWSResponse( request.url, urllib_response.status, urllib_response.headers, urllib_response, ) if not request.stream_output: # Cause the raw stream to be exhausted immediately. 
We do it # this way instead of using preload_content because # preload_content will never buffer chunked responses http_response.content return http_response except URLLib3SSLError as e: raise SSLError(endpoint_url=request.url, error=e) except (NewConnectionError, socket.gaierror) as e: raise EndpointConnectionError(endpoint_url=request.url, error=e) except ProxyError as e: raise ProxyConnectionError( proxy_url=mask_proxy_url(proxy_url), error=e ) except URLLib3ConnectTimeoutError as e: raise ConnectTimeoutError(endpoint_url=request.url, error=e) except URLLib3ReadTimeoutError as e: raise ReadTimeoutError(endpoint_url=request.url, error=e) except ProtocolError as e: raise ConnectionClosedError( error=e, request=request, endpoint_url=request.url ) except Exception as e: message = 'Exception received when sending urllib3 HTTP request' logger.debug(message, exc_info=True) > raise HTTPClientError(error=e) E botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/botocore/httpsession.py:509: HTTPClientError _ Route53ProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. 
host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) > urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) /usr/lib/python3.13/site-packages/botocore/httpsession.py:464: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/2013-04-01/hostedzonesbyname', body = None headers = {'User-Agent': b'Boto3/1.35.88 md/Botocore#1.35.88 ua/2.0 os/linux#6.1.80-2-sophgo-11457-g83ab3eda46e6 md/arch#riscv64...315ea04645bcbb5d01', 'amz-sdk-invocation-id': b'e2218513-2c3d-4883-af42-19b41369d83f', 'amz-sdk-request': b'attempt=1'} retries = Retry(total=False, connect=None, read=None, redirect=0, status=None) timeout = Timeout(connect=60, read=60, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? 
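Each test then reports botocore.exceptions.HTTPClientError rather than the AttributeError itself because send() above falls through to its generic "except Exception" handler and re-raises the wrapped error. A minimal, self-contained illustration of that wrapping (not part of the test suite):

    # Why pytest shows HTTPClientError: botocore wraps any unexpected exception.
    from botocore.exceptions import HTTPClientError

    try:
        raise AttributeError("'VCRHTTPResponse' object has no attribute 'version_string'")
    except Exception as exc:
        # Same pattern as botocore.httpsession.URLLib3Session.send()
        wrapped = HTTPClientError(error=exc)
        print(wrapped)  # "An HTTP Client raised an unhandled exception: ..."
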
tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/route53.py:144: in authenticate ??? src/lexicon/_private/providers/route53.py:167: in _lookup_hosted_zone ??? /usr/lib/python3.13/site-packages/botocore/client.py:569: in _api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1005: in _make_api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1029: in _make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:119: in make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:200: in _send_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:360: in _needs_retry ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:412: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:256: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:239: in _emit ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:207: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:284: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:307: in _should_retry ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:363: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:247: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:416: in _check_caught_exception ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:279: in _do_get_response ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:383: in _send ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) http_response = botocore.awsrequest.AWSResponse( request.url, urllib_response.status, urllib_response.headers, urllib_response, ) if not request.stream_output: # Cause the raw stream to be exhausted immediately. 
We do it # this way instead of using preload_content because # preload_content will never buffer chunked responses http_response.content return http_response except URLLib3SSLError as e: raise SSLError(endpoint_url=request.url, error=e) except (NewConnectionError, socket.gaierror) as e: raise EndpointConnectionError(endpoint_url=request.url, error=e) except ProxyError as e: raise ProxyConnectionError( proxy_url=mask_proxy_url(proxy_url), error=e ) except URLLib3ConnectTimeoutError as e: raise ConnectTimeoutError(endpoint_url=request.url, error=e) except URLLib3ReadTimeoutError as e: raise ReadTimeoutError(endpoint_url=request.url, error=e) except ProtocolError as e: raise ConnectionClosedError( error=e, request=request, endpoint_url=request.url ) except Exception as e: message = 'Exception received when sending urllib3 HTTP request' logger.debug(message, exc_info=True) > raise HTTPClientError(error=e) E botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/botocore/httpsession.py:509: HTTPClientError _ Route53ProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. 
host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) > urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) /usr/lib/python3.13/site-packages/botocore/httpsession.py:464: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/2013-04-01/hostedzonesbyname', body = None headers = {'User-Agent': b'Boto3/1.35.88 md/Botocore#1.35.88 ua/2.0 os/linux#6.1.80-2-sophgo-11457-g83ab3eda46e6 md/arch#riscv64...315ea04645bcbb5d01', 'amz-sdk-invocation-id': b'1f1b59db-7bc9-454c-9c71-3b39c3ff22a2', 'amz-sdk-request': b'attempt=1'} retries = Retry(total=False, connect=None, read=None, redirect=0, status=None) timeout = Timeout(connect=60, read=60, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? 
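Since the failure sits in the interaction between the recorded cassettes and newer urllib3, one possible stopgap is to give vcrpy's stub the missing attribute before the suite runs. This is a sketch only: the conftest location and the fallback value are assumptions, and upgrading to a vcrpy release that ships its own version_string support would be the cleaner fix.

    # tests/conftest.py (hypothetical shim): add the attribute urllib3 >= 2.3
    # expects when logging the response status line.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        def _version_string(self):
            # http.client uses 10 / 11 for HTTP/1.0 and HTTP/1.1; default to 1.1.
            return "HTTP/1.0" if getattr(self, "version", None) == 10 else "HTTP/1.1"

        VCRHTTPResponse.version_string = property(_version_string)
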
tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/route53.py:144: in authenticate ??? src/lexicon/_private/providers/route53.py:167: in _lookup_hosted_zone ??? /usr/lib/python3.13/site-packages/botocore/client.py:569: in _api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1005: in _make_api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1029: in _make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:119: in make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:200: in _send_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:360: in _needs_retry ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:412: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:256: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:239: in _emit ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:207: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:284: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:307: in _should_retry ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:363: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:247: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:416: in _check_caught_exception ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:279: in _do_get_response ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:383: in _send ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) http_response = botocore.awsrequest.AWSResponse( request.url, urllib_response.status, urllib_response.headers, urllib_response, ) if not request.stream_output: # Cause the raw stream to be exhausted immediately. 
We do it # this way instead of using preload_content because # preload_content will never buffer chunked responses http_response.content return http_response except URLLib3SSLError as e: raise SSLError(endpoint_url=request.url, error=e) except (NewConnectionError, socket.gaierror) as e: raise EndpointConnectionError(endpoint_url=request.url, error=e) except ProxyError as e: raise ProxyConnectionError( proxy_url=mask_proxy_url(proxy_url), error=e ) except URLLib3ConnectTimeoutError as e: raise ConnectTimeoutError(endpoint_url=request.url, error=e) except URLLib3ReadTimeoutError as e: raise ReadTimeoutError(endpoint_url=request.url, error=e) except ProtocolError as e: raise ConnectionClosedError( error=e, request=request, endpoint_url=request.url ) except Exception as e: message = 'Exception received when sending urllib3 HTTP request' logger.debug(message, exc_info=True) > raise HTTPClientError(error=e) E botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/botocore/httpsession.py:509: HTTPClientError _ Route53ProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. 
host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) > urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) /usr/lib/python3.13/site-packages/botocore/httpsession.py:464: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/2013-04-01/hostedzonesbyname', body = None headers = {'User-Agent': b'Boto3/1.35.88 md/Botocore#1.35.88 ua/2.0 os/linux#6.1.80-2-sophgo-11457-g83ab3eda46e6 md/arch#riscv64...bb150a44b6205f4aef', 'amz-sdk-invocation-id': b'569f52bc-ca73-44c1-8a4f-054b7b9dea7b', 'amz-sdk-request': b'attempt=1'} retries = Retry(total=False, connect=None, read=None, redirect=0, status=None) timeout = Timeout(connect=60, read=60, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? 
tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/route53.py:144: in authenticate ??? src/lexicon/_private/providers/route53.py:167: in _lookup_hosted_zone ??? /usr/lib/python3.13/site-packages/botocore/client.py:569: in _api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1005: in _make_api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1029: in _make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:119: in make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:200: in _send_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:360: in _needs_retry ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:412: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:256: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:239: in _emit ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:207: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:284: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:307: in _should_retry ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:363: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:247: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:416: in _check_caught_exception ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:279: in _do_get_response ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:383: in _send ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) http_response = botocore.awsrequest.AWSResponse( request.url, urllib_response.status, urllib_response.headers, urllib_response, ) if not request.stream_output: # Cause the raw stream to be exhausted immediately. 
We do it # this way instead of using preload_content because # preload_content will never buffer chunked responses http_response.content return http_response except URLLib3SSLError as e: raise SSLError(endpoint_url=request.url, error=e) except (NewConnectionError, socket.gaierror) as e: raise EndpointConnectionError(endpoint_url=request.url, error=e) except ProxyError as e: raise ProxyConnectionError( proxy_url=mask_proxy_url(proxy_url), error=e ) except URLLib3ConnectTimeoutError as e: raise ConnectTimeoutError(endpoint_url=request.url, error=e) except URLLib3ReadTimeoutError as e: raise ReadTimeoutError(endpoint_url=request.url, error=e) except ProtocolError as e: raise ConnectionClosedError( error=e, request=request, endpoint_url=request.url ) except Exception as e: message = 'Exception received when sending urllib3 HTTP request' logger.debug(message, exc_info=True) > raise HTTPClientError(error=e) E botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/botocore/httpsession.py:509: HTTPClientError _ Route53ProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. 
host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) > urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) /usr/lib/python3.13/site-packages/botocore/httpsession.py:464: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/2013-04-01/hostedzonesbyname', body = None headers = {'User-Agent': b'Boto3/1.35.88 md/Botocore#1.35.88 ua/2.0 os/linux#6.1.80-2-sophgo-11457-g83ab3eda46e6 md/arch#riscv64...bb150a44b6205f4aef', 'amz-sdk-invocation-id': b'452b1fc2-4938-47ef-be5b-1bbe4ed8201c', 'amz-sdk-request': b'attempt=1'} retries = Retry(total=False, connect=None, read=None, redirect=0, status=None) timeout = Timeout(connect=60, read=60, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? 
tests/providers/integration_tests.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/route53.py:144: in authenticate ??? src/lexicon/_private/providers/route53.py:167: in _lookup_hosted_zone ??? /usr/lib/python3.13/site-packages/botocore/client.py:569: in _api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1005: in _make_api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1029: in _make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:119: in make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:200: in _send_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:360: in _needs_retry ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:412: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:256: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:239: in _emit ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:207: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:284: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:307: in _should_retry ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:363: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:247: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:416: in _check_caught_exception ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:279: in _do_get_response ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:383: in _send ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) http_response = botocore.awsrequest.AWSResponse( request.url, urllib_response.status, urllib_response.headers, urllib_response, ) if not request.stream_output: # Cause the raw stream to be exhausted immediately. 
We do it
                # this way instead of using preload_content because
                # preload_content will never buffer chunked responses
                http_response.content
            return http_response
        except URLLib3SSLError as e:
            raise SSLError(endpoint_url=request.url, error=e)
        except (NewConnectionError, socket.gaierror) as e:
            raise EndpointConnectionError(endpoint_url=request.url, error=e)
        except ProxyError as e:
            raise ProxyConnectionError(
                proxy_url=mask_proxy_url(proxy_url), error=e
            )
        except URLLib3ConnectTimeoutError as e:
            raise ConnectTimeoutError(endpoint_url=request.url, error=e)
        except URLLib3ReadTimeoutError as e:
            raise ReadTimeoutError(endpoint_url=request.url, error=e)
        except ProtocolError as e:
            raise ConnectionClosedError(
                error=e, request=request, endpoint_url=request.url
            )
        except Exception as e:
            message = 'Exception received when sending urllib3 HTTP request'
            logger.debug(message, exc_info=True)
>           raise HTTPClientError(error=e)
E           botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/botocore/httpsession.py:509: HTTPClientError
_ Route53ProviderTests.test_provider_when_calling_list_records_after_setting_ttl _

[... botocore httpsession.send() source elided; identical in every Route53 failure below ...]
>           urllib_response = conn.urlopen(
                method=request.method,
                url=request_target,
                body=request.body,
                headers=request.headers,
                retries=Retry(False),
                assert_same_host=False,
                preload_content=False,
                decode_content=False,
                chunked=self._chunked(request.headers),
            )

/usr/lib/python3.13/site-packages/botocore/httpsession.py:464:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

method = 'GET', url = '/2013-04-01/hostedzonesbyname', body = None
retries = Retry(total=False, connect=None, read=None, redirect=0, status=None)
timeout = Timeout(connect=60, read=60, total=None), chunked = False
preload_content = False, decode_content = False, enforce_content_length = True

[... urllib3 connectionpool._make_request() signature, docstring and body elided;
 identical in every Route53 failure below ...]

            log.debug(
                '%s://%s:%s "%s %s %s" %s %s',
                self.scheme,
                self.host,
                self.port,
                method,
                url,
>               response.version_string,
                response.status,
                response.length_remaining,
            )
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

tests/providers/integration_tests.py:211:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/route53.py:144: in authenticate
src/lexicon/_private/providers/route53.py:167: in _lookup_hosted_zone
/usr/lib/python3.13/site-packages/botocore/client.py:569: in _api_call
/usr/lib/python3.13/site-packages/botocore/client.py:1005: in _make_api_call
/usr/lib/python3.13/site-packages/botocore/client.py:1029: in _make_request
/usr/lib/python3.13/site-packages/botocore/endpoint.py:119: in make_request
/usr/lib/python3.13/site-packages/botocore/endpoint.py:200: in _send_request
/usr/lib/python3.13/site-packages/botocore/endpoint.py:360: in _needs_retry
/usr/lib/python3.13/site-packages/botocore/hooks.py:412: in emit
/usr/lib/python3.13/site-packages/botocore/hooks.py:256: in emit
/usr/lib/python3.13/site-packages/botocore/hooks.py:239: in _emit
/usr/lib/python3.13/site-packages/botocore/retryhandler.py:207: in __call__
/usr/lib/python3.13/site-packages/botocore/retryhandler.py:284: in __call__
/usr/lib/python3.13/site-packages/botocore/retryhandler.py:307: in _should_retry
/usr/lib/python3.13/site-packages/botocore/retryhandler.py:363: in __call__
/usr/lib/python3.13/site-packages/botocore/retryhandler.py:247: in __call__
/usr/lib/python3.13/site-packages/botocore/retryhandler.py:416: in _check_caught_exception
/usr/lib/python3.13/site-packages/botocore/endpoint.py:279: in _do_get_response
/usr/lib/python3.13/site-packages/botocore/endpoint.py:383: in _send
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

[... botocore httpsession.send() source elided again; the replayed GET fails with the
 same AttributeError and botocore re-raises it as HTTPClientError ...]
>           raise HTTPClientError(error=e)
E           botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/botocore/httpsession.py:509: HTTPClientError
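Every Route53 failure in this run is the same mismatch: urllib3 2.3 now logs response.version_string from connectionpool._make_request(), but the VCRHTTPResponse object that vcrpy substitutes during cassette playback does not define that attribute, so botocore wraps the AttributeError in an HTTPClientError. A minimal conftest.py shim along the following lines should paper over the gap until the chroot carries a vcrpy release that knows about urllib3 2.3; vcr.stubs.VCRHTTPResponse is where vcrpy keeps its playback class, but the integer-to-string mapping and the HTTP/1.1 fallback below are assumptions, not upstream code.

# conftest.py -- hypothetical local workaround, not the upstream fix.
# urllib3 >= 2.3 reads response.version_string when logging a completed
# request; vcrpy's playback class predates that attribute, so attach a
# best-effort property to keep log.debug() in _make_request() from failing.
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):

    def _version_string(self):
        # Map http.client-style integer versions; fall back to HTTP/1.1 (assumption).
        return {9: "HTTP/0.9", 10: "HTTP/1.0", 11: "HTTP/1.1"}.get(
            getattr(self, "version", None), "HTTP/1.1"
        )

    VCRHTTPResponse.version_string = property(_version_string)

The cleaner fix is simply to rebuild once python-vcrpy in the repos catches up with urllib3 2.3; the shim is only a stop-gap for this chroot.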
_ Route53ProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _

[... botocore send() / urllib3 _make_request() frames identical to the first failure above ...]
>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

tests/providers/integration_tests.py:507:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/route53.py:144: in authenticate
src/lexicon/_private/providers/route53.py:167: in _lookup_hosted_zone
[... remaining botocore client/endpoint/retryhandler frames identical to the first failure above ...]

>           raise HTTPClientError(error=e)
E           botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/botocore/httpsession.py:509: HTTPClientError
_ Route53ProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _

[... botocore send() / urllib3 _make_request() frames identical to the first failure above ...]
>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

tests/providers/integration_tests.py:199:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/route53.py:144: in authenticate
src/lexicon/_private/providers/route53.py:167: in _lookup_hosted_zone
[... remaining botocore client/endpoint/retryhandler frames identical to the first failure above ...]

>           raise HTTPClientError(error=e)
E           botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/botocore/httpsession.py:509: HTTPClientError
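The identical AttributeError in every test of the class suggests the attribute is missing from the playback class itself rather than from one odd cassette. A quick check inside the build chroot can confirm that, assuming the same python-vcrpy and urllib3 as the test run above:

# Diagnosis check, run inside the build chroot (assumption: same vcrpy/urllib3
# as the failing test run).  urllib3's connectionpool.py:551 performs a plain
# attribute lookup, so a missing class attribute explains every failure.
from vcr.stubs import VCRHTTPResponse
import urllib3

print(urllib3.__version__)                          # expected: >= 2.3
print(hasattr(VCRHTTPResponse, "version_string"))   # False on the affected vcrpy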
_ Route53ProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _

[... botocore send() / urllib3 _make_request() frames identical to the first failure above ...]
>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

tests/providers/integration_tests.py:185:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/route53.py:144: in authenticate
src/lexicon/_private/providers/route53.py:167: in _lookup_hosted_zone
[... remaining botocore client/endpoint/retryhandler frames identical to the first failure above ...]

>           raise HTTPClientError(error=e)
E           botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/botocore/httpsession.py:509: HTTPClientError
_ Route53ProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _

[... botocore send() / urllib3 _make_request() frames identical to the first failure above ...]
>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

tests/providers/integration_tests.py:501:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/route53.py:144: in authenticate
src/lexicon/_private/providers/route53.py:167: in _lookup_hosted_zone
[... remaining botocore client/endpoint/retryhandler frames identical to the first failure above ...]

>           raise HTTPClientError(error=e)
E           botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/botocore/httpsession.py:509: HTTPClientError
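If patching the stub is not wanted in the package build, an alternative is to skip the affected class at collection time and revisit the cassettes later. A sketch, assuming check() keeps invoking pytest directly and that Route53ProviderTests is the only class hitting the vcrpy/urllib3 mismatch:

# conftest.py sketch -- hypothetical: skip the Route53 cassette tests while the
# VCRHTTPResponse/version_string mismatch is unresolved.
import pytest

def pytest_collection_modifyitems(config, items):
    skip_route53 = pytest.mark.skip(
        reason="vcrpy's VCRHTTPResponse lacks version_string (urllib3 >= 2.3)"
    )
    for item in items:
        # Route53ProviderTests is the failing class seen in this build log.
        if "Route53ProviderTests" in item.nodeid:
            item.add_marker(skip_route53)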
We do it # this way instead of using preload_content because # preload_content will never buffer chunked responses http_response.content return http_response except URLLib3SSLError as e: raise SSLError(endpoint_url=request.url, error=e) except (NewConnectionError, socket.gaierror) as e: raise EndpointConnectionError(endpoint_url=request.url, error=e) except ProxyError as e: raise ProxyConnectionError( proxy_url=mask_proxy_url(proxy_url), error=e ) except URLLib3ConnectTimeoutError as e: raise ConnectTimeoutError(endpoint_url=request.url, error=e) except URLLib3ReadTimeoutError as e: raise ReadTimeoutError(endpoint_url=request.url, error=e) except ProtocolError as e: raise ConnectionClosedError( error=e, request=request, endpoint_url=request.url ) except Exception as e: message = 'Exception received when sending urllib3 HTTP request' logger.debug(message, exc_info=True) > raise HTTPClientError(error=e) E botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/botocore/httpsession.py:509: HTTPClientError _ Route53ProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. 
host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) > urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) /usr/lib/python3.13/site-packages/botocore/httpsession.py:464: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/2013-04-01/hostedzonesbyname', body = None headers = {'User-Agent': b'Boto3/1.35.88 md/Botocore#1.35.88 ua/2.0 os/linux#6.1.80-2-sophgo-11457-g83ab3eda46e6 md/arch#riscv64...0c996a731addfb2fd3', 'amz-sdk-invocation-id': b'cb8040c7-ecd6-4f0e-884f-b927f0259fc9', 'amz-sdk-request': b'attempt=1'} retries = Retry(total=False, connect=None, read=None, redirect=0, status=None) timeout = Timeout(connect=60, read=60, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? 
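The AttributeError above is the root cause of every Route53 failure in this run: urllib3's connection pool now logs response.version_string right after getresponse(), but the VCRHTTPResponse stub that vcrpy substitutes during cassette replay does not define that attribute, so botocore ends up wrapping the failure in HTTPClientError. A minimal local workaround, assuming vcrpy exposes the stub at vcr.stubs and that the attribute is only ever read for that debug log line, would be a conftest.py shim like the sketch below; the proper fix is updating python-vcrpy to a release that implements version_string.

    # conftest.py -- hypothetical compatibility shim, not lexicon's own code
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # urllib3 only reads this for its '%s://%s:%s ...' debug log line,
        # so a constant class attribute is enough to unblock replay.
        VCRHTTPResponse.version_string = "HTTP/1.1"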
tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/route53.py:144: in authenticate ??? src/lexicon/_private/providers/route53.py:167: in _lookup_hosted_zone ??? /usr/lib/python3.13/site-packages/botocore/client.py:569: in _api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1005: in _make_api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1029: in _make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:119: in make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:200: in _send_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:360: in _needs_retry ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:412: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:256: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:239: in _emit ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:207: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:284: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:307: in _should_retry ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:363: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:247: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:416: in _check_caught_exception ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:279: in _do_get_response ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:383: in _send ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) http_response = botocore.awsrequest.AWSResponse( request.url, urllib_response.status, urllib_response.headers, urllib_response, ) if not request.stream_output: # Cause the raw stream to be exhausted immediately. 
We do it # this way instead of using preload_content because # preload_content will never buffer chunked responses http_response.content return http_response except URLLib3SSLError as e: raise SSLError(endpoint_url=request.url, error=e) except (NewConnectionError, socket.gaierror) as e: raise EndpointConnectionError(endpoint_url=request.url, error=e) except ProxyError as e: raise ProxyConnectionError( proxy_url=mask_proxy_url(proxy_url), error=e ) except URLLib3ConnectTimeoutError as e: raise ConnectTimeoutError(endpoint_url=request.url, error=e) except URLLib3ReadTimeoutError as e: raise ReadTimeoutError(endpoint_url=request.url, error=e) except ProtocolError as e: raise ConnectionClosedError( error=e, request=request, endpoint_url=request.url ) except Exception as e: message = 'Exception received when sending urllib3 HTTP request' logger.debug(message, exc_info=True) > raise HTTPClientError(error=e) E botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/botocore/httpsession.py:509: HTTPClientError _ Route53ProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. 
host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) > urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) /usr/lib/python3.13/site-packages/botocore/httpsession.py:464: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/2013-04-01/hostedzonesbyname', body = None headers = {'User-Agent': b'Boto3/1.35.88 md/Botocore#1.35.88 ua/2.0 os/linux#6.1.80-2-sophgo-11457-g83ab3eda46e6 md/arch#riscv64...0c996a731addfb2fd3', 'amz-sdk-invocation-id': b'c56714bb-1fc3-431a-b0ef-27833b7c7f14', 'amz-sdk-request': b'attempt=1'} retries = Retry(total=False, connect=None, read=None, redirect=0, status=None) timeout = Timeout(connect=60, read=60, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? 
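Every one of these tests fails at the same point: the provider's authenticate() step (route53.py:144, then _lookup_hosted_zone at :167 in the frames above) issues the GET /2013-04-01/hostedzonesbyname request, i.e. Route53's ListHostedZonesByName API, and that is the recorded exchange VCR replays. A rough sketch of that lookup, assuming the provider simply resolves the hosted zone id for the test domain (illustrative only, not lexicon's actual code):

    import boto3

    client = boto3.client("route53")
    # ListHostedZonesByName is the GET /2013-04-01/hostedzonesbyname call
    # seen in every traceback; the first match is the zone to operate on.
    response = client.list_hosted_zones_by_name(DNSName="example.com")
    hosted_zone_id = response["HostedZones"][0]["Id"]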
tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/route53.py:144: in authenticate ??? src/lexicon/_private/providers/route53.py:167: in _lookup_hosted_zone ??? /usr/lib/python3.13/site-packages/botocore/client.py:569: in _api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1005: in _make_api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1029: in _make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:119: in make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:200: in _send_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:360: in _needs_retry ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:412: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:256: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:239: in _emit ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:207: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:284: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:307: in _should_retry ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:363: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:247: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:416: in _check_caught_exception ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:279: in _do_get_response ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:383: in _send ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) http_response = botocore.awsrequest.AWSResponse( request.url, urllib_response.status, urllib_response.headers, urllib_response, ) if not request.stream_output: # Cause the raw stream to be exhausted immediately. 
We do it # this way instead of using preload_content because # preload_content will never buffer chunked responses http_response.content return http_response except URLLib3SSLError as e: raise SSLError(endpoint_url=request.url, error=e) except (NewConnectionError, socket.gaierror) as e: raise EndpointConnectionError(endpoint_url=request.url, error=e) except ProxyError as e: raise ProxyConnectionError( proxy_url=mask_proxy_url(proxy_url), error=e ) except URLLib3ConnectTimeoutError as e: raise ConnectTimeoutError(endpoint_url=request.url, error=e) except URLLib3ReadTimeoutError as e: raise ReadTimeoutError(endpoint_url=request.url, error=e) except ProtocolError as e: raise ConnectionClosedError( error=e, request=request, endpoint_url=request.url ) except Exception as e: message = 'Exception received when sending urllib3 HTTP request' logger.debug(message, exc_info=True) > raise HTTPClientError(error=e) E botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/botocore/httpsession.py:509: HTTPClientError _ Route53ProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. 
host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) > urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) /usr/lib/python3.13/site-packages/botocore/httpsession.py:464: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/2013-04-01/hostedzonesbyname', body = None headers = {'User-Agent': b'Boto3/1.35.88 md/Botocore#1.35.88 ua/2.0 os/linux#6.1.80-2-sophgo-11457-g83ab3eda46e6 md/arch#riscv64...ea47308e9e4f52be36', 'amz-sdk-invocation-id': b'b3111960-2014-42c4-808a-e267262a3a58', 'amz-sdk-request': b'attempt=1'} retries = Retry(total=False, connect=None, read=None, redirect=0, status=None) timeout = Timeout(connect=60, read=60, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? 
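The retries repr in these failures, Retry(total=False, ...), comes from botocore passing retries=Retry(False) into conn.urlopen(). As the _make_request docstring above explains, False disables urllib3-level retries entirely: exceptions are raised immediately and redirect responses are returned instead of raising MaxRetryError, which is why the retry decisions appear in the botocore retryhandler frames rather than inside urllib3. A small, purely illustrative sketch of the two modes:

    from urllib3.util.retry import Retry

    no_retries = Retry(False)       # raise immediately, hand redirects back as-is
    three_total = Retry(total=3)    # let urllib3 itself retry up to three attempts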
tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/route53.py:144: in authenticate ??? src/lexicon/_private/providers/route53.py:167: in _lookup_hosted_zone ??? /usr/lib/python3.13/site-packages/botocore/client.py:569: in _api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1005: in _make_api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1029: in _make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:119: in make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:200: in _send_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:360: in _needs_retry ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:412: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:256: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:239: in _emit ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:207: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:284: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:307: in _should_retry ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:363: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:247: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:416: in _check_caught_exception ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:279: in _do_get_response ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:383: in _send ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) http_response = botocore.awsrequest.AWSResponse( request.url, urllib_response.status, urllib_response.headers, urllib_response, ) if not request.stream_output: # Cause the raw stream to be exhausted immediately. 
We do it # this way instead of using preload_content because # preload_content will never buffer chunked responses http_response.content return http_response except URLLib3SSLError as e: raise SSLError(endpoint_url=request.url, error=e) except (NewConnectionError, socket.gaierror) as e: raise EndpointConnectionError(endpoint_url=request.url, error=e) except ProxyError as e: raise ProxyConnectionError( proxy_url=mask_proxy_url(proxy_url), error=e ) except URLLib3ConnectTimeoutError as e: raise ConnectTimeoutError(endpoint_url=request.url, error=e) except URLLib3ReadTimeoutError as e: raise ReadTimeoutError(endpoint_url=request.url, error=e) except ProtocolError as e: raise ConnectionClosedError( error=e, request=request, endpoint_url=request.url ) except Exception as e: message = 'Exception received when sending urllib3 HTTP request' logger.debug(message, exc_info=True) > raise HTTPClientError(error=e) E botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/botocore/httpsession.py:509: HTTPClientError _ Route53ProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. 
host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) > urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) /usr/lib/python3.13/site-packages/botocore/httpsession.py:464: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/2013-04-01/hostedzonesbyname', body = None headers = {'User-Agent': b'Boto3/1.35.88 md/Botocore#1.35.88 ua/2.0 os/linux#6.1.80-2-sophgo-11457-g83ab3eda46e6 md/arch#riscv64...ea47308e9e4f52be36', 'amz-sdk-invocation-id': b'31b211fd-4c36-4d75-8677-b839895b7b68', 'amz-sdk-request': b'attempt=1'} retries = Retry(total=False, connect=None, read=None, redirect=0, status=None) timeout = Timeout(connect=60, read=60, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? 
tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/route53.py:144: in authenticate ??? src/lexicon/_private/providers/route53.py:167: in _lookup_hosted_zone ??? /usr/lib/python3.13/site-packages/botocore/client.py:569: in _api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1005: in _make_api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1029: in _make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:119: in make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:200: in _send_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:360: in _needs_retry ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:412: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:256: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:239: in _emit ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:207: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:284: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:307: in _should_retry ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:363: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:247: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:416: in _check_caught_exception ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:279: in _do_get_response ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:383: in _send ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) http_response = botocore.awsrequest.AWSResponse( request.url, urllib_response.status, urllib_response.headers, urllib_response, ) if not request.stream_output: # Cause the raw stream to be exhausted immediately. 
We do it # this way instead of using preload_content because # preload_content will never buffer chunked responses http_response.content return http_response except URLLib3SSLError as e: raise SSLError(endpoint_url=request.url, error=e) except (NewConnectionError, socket.gaierror) as e: raise EndpointConnectionError(endpoint_url=request.url, error=e) except ProxyError as e: raise ProxyConnectionError( proxy_url=mask_proxy_url(proxy_url), error=e ) except URLLib3ConnectTimeoutError as e: raise ConnectTimeoutError(endpoint_url=request.url, error=e) except URLLib3ReadTimeoutError as e: raise ReadTimeoutError(endpoint_url=request.url, error=e) except ProtocolError as e: raise ConnectionClosedError( error=e, request=request, endpoint_url=request.url ) except Exception as e: message = 'Exception received when sending urllib3 HTTP request' logger.debug(message, exc_info=True) > raise HTTPClientError(error=e) E botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/botocore/httpsession.py:509: HTTPClientError _ Route53ProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. 
host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) > urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) /usr/lib/python3.13/site-packages/botocore/httpsession.py:464: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/2013-04-01/hostedzonesbyname', body = None headers = {'User-Agent': b'Boto3/1.35.88 md/Botocore#1.35.88 ua/2.0 os/linux#6.1.80-2-sophgo-11457-g83ab3eda46e6 md/arch#riscv64...ea47308e9e4f52be36', 'amz-sdk-invocation-id': b'82d5c40b-5975-401c-9400-380e4dbf7452', 'amz-sdk-request': b'attempt=1'} retries = Retry(total=False, connect=None, read=None, redirect=0, status=None) timeout = Timeout(connect=60, read=60, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError During handling of the above exception, another exception occurred: self = > ??? 
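One further detail of the send() wrapper repeated above: botocore passes preload_content=False and then, for non-streaming requests, touches http_response.content so the body is read and cached up front, because preload_content never buffers chunked responses. Pulled out as a standalone helper, the pattern looks roughly like this (a sketch of the behaviour shown in the log, with a hypothetical helper name):

    import botocore.awsrequest

    def buffered_aws_response(url, urllib_response, stream_output=False):
        http_response = botocore.awsrequest.AWSResponse(
            url, urllib_response.status, urllib_response.headers, urllib_response
        )
        if not stream_output:
            # accessing .content drains the raw stream once and caches the bytes
            http_response.content
        return http_response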
tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/route53.py:144: in authenticate ??? src/lexicon/_private/providers/route53.py:167: in _lookup_hosted_zone ??? /usr/lib/python3.13/site-packages/botocore/client.py:569: in _api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1005: in _make_api_call ??? /usr/lib/python3.13/site-packages/botocore/client.py:1029: in _make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:119: in make_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:200: in _send_request ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:360: in _needs_retry ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:412: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:256: in emit ??? /usr/lib/python3.13/site-packages/botocore/hooks.py:239: in _emit ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:207: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:284: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:307: in _should_retry ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:363: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:247: in __call__ ??? /usr/lib/python3.13/site-packages/botocore/retryhandler.py:416: in _check_caught_exception ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:279: in _do_get_response ??? /usr/lib/python3.13/site-packages/botocore/endpoint.py:383: in _send ??? _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = def send(self, request): try: proxy_url = self._proxy_config.proxy_url_for(request.url) manager = self._get_connection_manager(request.url, proxy_url) conn = manager.connection_from_url(request.url) self._setup_ssl_cert(conn, request.url, self._verify) if ensure_boolean( os.environ.get('BOTO_EXPERIMENTAL__ADD_PROXY_HOST_HEADER', '') ): # This is currently an "experimental" feature which provides # no guarantees of backwards compatibility. It may be subject # to change or removal in any patch version. Anyone opting in # to this feature should strictly pin botocore. host = urlparse(request.url).hostname conn.proxy_headers['host'] = host request_target = self._get_request_target(request.url, proxy_url) urllib_response = conn.urlopen( method=request.method, url=request_target, body=request.body, headers=request.headers, retries=Retry(False), assert_same_host=False, preload_content=False, decode_content=False, chunked=self._chunked(request.headers), ) http_response = botocore.awsrequest.AWSResponse( request.url, urllib_response.status, urllib_response.headers, urllib_response, ) if not request.stream_output: # Cause the raw stream to be exhausted immediately. 
We do it
                    # this way instead of using preload_content because
                    # preload_content will never buffer chunked responses
                    http_response.content
                return http_response
            except URLLib3SSLError as e:
                raise SSLError(endpoint_url=request.url, error=e)
            except (NewConnectionError, socket.gaierror) as e:
                raise EndpointConnectionError(endpoint_url=request.url, error=e)
            except ProxyError as e:
                raise ProxyConnectionError(
                    proxy_url=mask_proxy_url(proxy_url), error=e
                )
            except URLLib3ConnectTimeoutError as e:
                raise ConnectTimeoutError(endpoint_url=request.url, error=e)
            except URLLib3ReadTimeoutError as e:
                raise ReadTimeoutError(endpoint_url=request.url, error=e)
            except ProtocolError as e:
                raise ConnectionClosedError(
                    error=e, request=request, endpoint_url=request.url
                )
            except Exception as e:
                message = 'Exception received when sending urllib3 HTTP request'
                logger.debug(message, exc_info=True)
>               raise HTTPClientError(error=e)
E               botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/botocore/httpsession.py:509: HTTPClientError
_ Route53ProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _

self =  request = 

/usr/lib/python3.13/site-packages/botocore/httpsession.py:464: in send
    urllib_response = conn.urlopen(
        method=request.method,
        url=request_target,
        body=request.body,
        headers=request.headers,
        retries=Retry(False),
        assert_same_host=False,
        preload_content=False,
        decode_content=False,
        chunked=self._chunked(request.headers),
    )
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  conn =  method = 'GET', url = '/2013-04-01/hostedzonesbyname', body = None
headers = {'User-Agent': b'Boto3/1.35.88 md/Botocore#1.35.88 ua/2.0 os/linux#6.1.80-2-sophgo-11457-g83ab3eda46e6 md/arch#riscv64...ea47308e9e4f52be36', 'amz-sdk-invocation-id': b'e2fcf357-a792-497e-b053-99fd8bffc81b', 'amz-sdk-request': b'attempt=1'}
retries = Retry(total=False, connect=None, read=None, redirect=0, status=None)
timeout = Timeout(connect=60, read=60, total=None), chunked = False
response_conn =  preload_content = False, decode_content = False, enforce_content_length = True

Inside urllib3's HTTPConnectionPool._make_request(), the cassette-backed response fails when urllib3 logs the completed request:

        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

During handling of the above exception, another exception occurred:

self =  > ???

tests/providers/integration_tests.py:259:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/route53.py:144: in authenticate
src/lexicon/_private/providers/route53.py:167: in _lookup_hosted_zone
/usr/lib/python3.13/site-packages/botocore/client.py:569: in _api_call
/usr/lib/python3.13/site-packages/botocore/client.py:1005: in _make_api_call
/usr/lib/python3.13/site-packages/botocore/client.py:1029: in _make_request
/usr/lib/python3.13/site-packages/botocore/endpoint.py:119: in make_request
/usr/lib/python3.13/site-packages/botocore/endpoint.py:200: in _send_request
/usr/lib/python3.13/site-packages/botocore/endpoint.py:360: in _needs_retry
/usr/lib/python3.13/site-packages/botocore/hooks.py:412: in emit
/usr/lib/python3.13/site-packages/botocore/hooks.py:256: in emit
/usr/lib/python3.13/site-packages/botocore/hooks.py:239: in _emit
/usr/lib/python3.13/site-packages/botocore/retryhandler.py:207: in __call__
/usr/lib/python3.13/site-packages/botocore/retryhandler.py:284: in __call__
/usr/lib/python3.13/site-packages/botocore/retryhandler.py:307: in _should_retry
/usr/lib/python3.13/site-packages/botocore/retryhandler.py:363: in __call__
/usr/lib/python3.13/site-packages/botocore/retryhandler.py:247: in __call__
/usr/lib/python3.13/site-packages/botocore/retryhandler.py:416: in _check_caught_exception
/usr/lib/python3.13/site-packages/botocore/endpoint.py:279: in _do_get_response
/usr/lib/python3.13/site-packages/botocore/endpoint.py:383: in _send
/usr/lib/python3.13/site-packages/botocore/httpsession.py:509: in send
>   raise HTTPClientError(error=e)
E   botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: 'VCRHTTPResponse' object has no attribute 'version_string'
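Every failure in this run traces back to the same incompatibility: the urllib3 in the chroot logs response.version_string at connectionpool.py:551 (an attribute added in the urllib3 2.3 series), while the VCRHTTPResponse object that vcrpy substitutes during cassette playback does not define it, so the request dies inside urllib3 before the provider code ever sees a response. A minimal local workaround is sketched below; it is an assumption, not part of dns-lexicon or vcrpy: it presumes the class is still importable as vcr.stubs.VCRHTTPResponse and that a fixed placeholder value is acceptable, since urllib3 only interpolates the value into the debug log line shown above.

    # conftest.py -- hypothetical local shim, assuming python-vcrpy in the build
    # root still lacks the attribute that urllib3 >= 2.3 logs.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # Only used for urllib3's debug log line, so a static value is enough
        # for cassette playback.
        VCRHTTPResponse.version_string = "HTTP/1.1"

Pinning urllib3 below 2.3 in the build root, or updating python-vcrpy once a release that adds the attribute is available, would remove the need for any local patch.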
_______________ SafednsProviderTests.test_provider_authenticate ________________

self =  > ???

tests/providers/integration_tests.py:106:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/safedns.py:40: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/safedns.py:193: in _request
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  conn =  method = 'GET', url = '/safedns/v1/zones/lexicon.tests', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'placeholder_auth_token', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =  preload_content = False, decode_content = False, enforce_content_length = True

The Safedns provider goes through requests instead of botocore, but reaches the same logging call in urllib3's HTTPConnectionPool._make_request(), so here the AttributeError surfaces directly rather than being wrapped in HTTPClientError:

        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ SafednsProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

self =  > ???

tests/providers/integration_tests.py:126: same call chain as above
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ SafednsProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

self =  > ???

tests/providers/integration_tests.py:156: same call chain as above
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
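If patching vcrpy is not wanted, another option is to keep the cassette-based suites out of the run until the incompatibility is fixed upstream. The hook below is a hypothetical sketch, not part of the package: it assumes check() runs pytest with a conftest.py that can be extended, and it only names the two test classes visible in this part of the log; other provider suites may be affected as well.

    # conftest.py -- hypothetical alternative: skip the affected suites at
    # collection time instead of patching vcrpy.
    import pytest

    def pytest_collection_modifyitems(config, items):
        affected = ("Route53ProviderTests", "SafednsProviderTests")
        marker = pytest.mark.skip(
            reason="VCR cassette playback lacks version_string (urllib3 2.3 incompatibility)"
        )
        for item in items:
            if any(name in item.nodeid for name in affected):
                item.add_marker(marker)

Skipping trades test coverage for an unblocked build, so the vcrpy-side shim above is preferable where it works.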
_ SafednsProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

self =  > ???

tests/providers/integration_tests.py:147: same call chain as above
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ SafednsProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _

self =  > ???

tests/providers/integration_tests.py:140: same call chain as above
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ SafednsProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _

self =  > ???

tests/providers/integration_tests.py:489: same call chain as above
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ SafednsProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _

self =  > ???

tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/safedns.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/safedns.py:193: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/safedns/v1/zones/lexicon.tests', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, )
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
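Every SafednsProviderTests failure in this run is the same crash: the suite replays recorded HTTP exchanges through vcrpy (the VCRHTTPResponse in the error is its playback stub), and that stub has no version_string attribute, while the urllib3 in the chroot reads response.version_string when it logs a completed request in connectionpool.py, so authenticate() dies before any assertion runs. A minimal local shim is sketched below, assuming the stub class is importable as vcr.stubs.VCRHTTPResponse and that a fixed "HTTP/1.1" placeholder is acceptable for cassette playback; this is an illustrative workaround, not the fix either upstream project ships.

# conftest.py -- illustrative shim: give vcrpy's playback stub the attribute
# that this urllib3 reads in its connectionpool debug log line.
# "HTTP/1.1" is an assumption about the protocol version of the cassettes.
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):
    VCRHTTPResponse.version_string = "HTTP/1.1"

With the attribute present, the debug log line formats normally and playback proceeds to the recorded response.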
_ SafednsProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _
self = > ??? tests/providers/integration_tests.py:303:
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ SafednsProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _
self = > ??? tests/providers/integration_tests.py:327:
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ SafednsProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _
self = > ??? tests/providers/integration_tests.py:313:
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ SafednsProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _
self = > ??? tests/providers/integration_tests.py:294:
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
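From the packaging side the incompatibility can also be fenced off at collection time rather than patched: if the vcrpy stub is known to lack the attribute, the affected provider tests can be skipped so the rest of the check() suite still gates the build. The hook below is a sketch under that assumption; filtering on SafednsProviderTests matches only the failures visible in this log, and a real exclusion would likely need to cover every cassette-backed provider class.

# conftest.py -- sketch: skip the SafeDNS cassette tests only when the
# installed vcrpy stub predates the version_string attribute.
import pytest
from vcr.stubs import VCRHTTPResponse

def pytest_collection_modifyitems(config, items):
    if hasattr(VCRHTTPResponse, "version_string"):
        return  # stub is compatible; run the full suite
    skip = pytest.mark.skip(reason="vcrpy stub lacks version_string")
    for item in items:
        if "SafednsProviderTests" in item.nodeid:
            item.add_marker(skip)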
_ SafednsProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _
self = > ??? tests/providers/integration_tests.py:541:
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ SafednsProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _
self = > ??? tests/providers/integration_tests.py:521:
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ SafednsProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _
self = > ??? tests/providers/integration_tests.py:507:
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ SafednsProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _
self = > ???
tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/safedns.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/safedns.py:193: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/safedns/v1/zones/lexicon.tests', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
        response.retries = retries
        response._connection = response_conn  # type: ignore[attr-defined]
        response._pool = self  # type: ignore[attr-defined]

        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
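Every failure in this run dies at the same statement: urllib3's connectionpool.py:551 log.debug() call reads response.version_string, an attribute that recent urllib3 releases (2.3.x, per its changelog) expect every response object to provide, while the playback response that vcrpy substitutes during these cassette-backed tests (VCRHTTPResponse) does not define it. A quick check along the following lines can confirm that the build chroot carries such a urllib3/vcrpy combination; this is only a sketch run by hand, not part of the dns-lexicon test suite.

```python
# Sketch: confirm the urllib3/vcrpy combination inside the build chroot.
from importlib.metadata import version

from vcr.stubs import VCRHTTPResponse  # vcrpy's playback response class

print("urllib3:", version("urllib3"))   # 2.3+ logs response.version_string
print("vcrpy:  ", version("vcrpy"))
print("VCRHTTPResponse has version_string:",
      hasattr(VCRHTTPResponse, "version_string"))
```

If the last line prints False while urllib3 reports 2.3 or newer, the AttributeError above is a packaging-level version mismatch rather than a bug in the SafeDNS provider code.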
_ SafednsProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _

tests/providers/integration_tests.py:185:
    [frame chain and echoed urllib3 _make_request source identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
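If rebuilding against a python-vcrpy release that already provides version_string is not immediately possible, one local workaround is to give the playback class the attribute before the tests run, for example from a conftest.py in the build directory. The sketch below is an assumed shim, not something shipped by dns-lexicon, and the "HTTP/1.1" value is only a placeholder.

```python
# conftest.py -- hypothetical local shim, not part of the upstream package.
# urllib3 >= 2.3 reads response.version_string when logging a completed
# request; older vcrpy playback responses do not define it, which is what
# raises the AttributeError seen throughout this run.
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):
    # Assumed placeholder value; only consumed by urllib3's debug log line.
    VCRHTTPResponse.version_string = "HTTP/1.1"
```

The cleaner fix is to build against a vcrpy release that includes urllib3 2.3 support, at which point the hasattr() guard makes this shim a no-op.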
_ SafednsProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _

tests/providers/integration_tests.py:501:
    [frame chain and echoed urllib3 _make_request source identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ SafednsProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _

tests/providers/integration_tests.py:173:
    [frame chain and echoed urllib3 _make_request source identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ SafednsProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _

tests/providers/integration_tests.py:166:
    [frame chain and echoed urllib3 _make_request source identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ SafednsProviderTests.test_provider_when_calling_update_record_should_modify_record _

tests/providers/integration_tests.py:240:
    [frame chain and echoed urllib3 _make_request source identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ SafednsProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _

tests/providers/integration_tests.py:251:
    [frame chain and echoed urllib3 _make_request source identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ SafednsProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _

tests/providers/integration_tests.py:275:
    [frame chain and echoed urllib3 _make_request source identical to the first failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _____________ SakruaCloudProviderTests.test_provider_authenticate ______________ self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/sakuracloud.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/sakuracloud.py:227: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/cloud/zone/is1a/api/cloud/1.1/commonserviceitem?%7B%22Filter%22:%20%7B%22Provider.Class%22:%20%22dns%22,%20%22Name%22:%20%22example.com%22%7D%7D' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjpwbGFjZWhvbGRlcl9hdXRoX3NlY3JldA=='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
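Every failure in this run is the same crash: urllib3's connection pool logs `response.version_string` at connectionpool.py:551, an attribute that current urllib3 releases (apparently introduced in urllib3 2.3) expect on every response object, but the stub response that vcrpy substitutes while replaying recorded cassettes (`VCRHTTPResponse`) does not define it. Upgrading vcrpy or pinning urllib3 below 2.3 in the build chroot would be the cleaner fixes; a minimal local shim, assuming vcrpy is the cassette library in use here and that a fixed "HTTP/1.1" placeholder is acceptable, could look like this:

    # conftest.py -- hypothetical shim, not part of dns-lexicon or vcrpy itself.
    # Older vcrpy stubs predate urllib3's `version_string` attribute, so the
    # replayed response raises AttributeError inside urllib3's log.debug() call.
    # Giving the stub class a default keeps cassette replay working.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        VCRHTTPResponse.version_string = "HTTP/1.1"  # assumed placeholder value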
_____________ SakruaCloudProviderTests.test_provider_authenticate ______________
tests/providers/integration_tests.py:106:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/sakuracloud.py:40: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/sakuracloud.py:227: in _request
    [same requests/urllib3 frames and _make_request listing as in the failure
     above; GET '/cloud/zone/is1a/api/cloud/1.1/commonserviceitem?%7B%22Filter%22:%20%7B%22Provider.Class%22:%20%22dns%22,%20%22Name%22:%20%22example.com%22%7D%7D']
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ SakruaCloudProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
tests/providers/integration_tests.py:126: via _construct_authenticated_provider -> sakuracloud.py:40 (authenticate); same traceback as above
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ SakruaCloudProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
tests/providers/integration_tests.py:133: via _construct_authenticated_provider -> sakuracloud.py:40 (authenticate); same traceback as above
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ SakruaCloudProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
tests/providers/integration_tests.py:156: via _construct_authenticated_provider -> sakuracloud.py:40 (authenticate); same traceback as above
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
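The SakuraCloud failures above and below all replay the same GET against /cloud/zone/is1a/api/cloud/1.1/commonserviceitem; the percent-encoded query string is simply the JSON filter the provider sends while looking up the test zone. Decoding it with the standard library makes the recorded request easier to read:

    from urllib.parse import unquote

    # Query string exactly as it appears in the tracebacks above.
    query = ('%7B%22Filter%22:%20%7B%22Provider.Class%22:%20%22dns%22,'
             '%20%22Name%22:%20%22example.com%22%7D%7D')
    print(unquote(query))
    # -> {"Filter": {"Provider.Class": "dns", "Name": "example.com"}}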
_ SakruaCloudProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _
tests/providers/integration_tests.py:147: via _construct_authenticated_provider -> sakuracloud.py:40 (authenticate); same traceback as above
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ SakruaCloudProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _
tests/providers/integration_tests.py:140: via _construct_authenticated_provider -> sakuracloud.py:40 (authenticate); same traceback as above
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ SakruaCloudProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _
tests/providers/integration_tests.py:303: via _construct_authenticated_provider -> sakuracloud.py:40 (authenticate); same traceback as above
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ SakruaCloudProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _
tests/providers/integration_tests.py:327: via _construct_authenticated_provider -> sakuracloud.py:40 (authenticate); same traceback as above
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ SakruaCloudProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _
tests/providers/integration_tests.py:313:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/sakuracloud.py:40: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/sakuracloud.py:227: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/cloud/zone/is1a/api/cloud/1.1/commonserviceitem?%7B%22Filter%22:%20%7B%22Provider.Class%22:%20%22dns%22,%20%22Name%22:%20%22example.com%22%7D%7D' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjpwbGFjZWhvbGRlcl9hdXRoX3NlY3JldA=='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ SakruaCloudProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ self = > ??? tests/providers/integration_tests.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/sakuracloud.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/sakuracloud.py:227: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/cloud/zone/is1a/api/cloud/1.1/commonserviceitem?%7B%22Filter%22:%20%7B%22Provider.Class%22:%20%22dns%22,%20%22Name%22:%20%22example.com%22%7D%7D' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjpwbGFjZWhvbGRlcl9hdXRoX3NlY3JldA=='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ SakruaCloudProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/sakuracloud.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/sakuracloud.py:227: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/cloud/zone/is1a/api/cloud/1.1/commonserviceitem?%7B%22Filter%22:%20%7B%22Provider.Class%22:%20%22dns%22,%20%22Name%22:%20%22example.com%22%7D%7D' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjpwbGFjZWhvbGRlcl9hdXRoX3NlY3JldA=='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ SakruaCloudProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/sakuracloud.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/sakuracloud.py:227: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/cloud/zone/is1a/api/cloud/1.1/commonserviceitem?%7B%22Filter%22:%20%7B%22Provider.Class%22:%20%22dns%22,%20%22Name%22:%20%22example.com%22%7D%7D' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjpwbGFjZWhvbGRlcl9hdXRoX3NlY3JldA=='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ SakruaCloudProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/sakuracloud.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/sakuracloud.py:227: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/cloud/zone/is1a/api/cloud/1.1/commonserviceitem?%7B%22Filter%22:%20%7B%22Provider.Class%22:%20%22dns%22,%20%22Name%22:%20%22example.com%22%7D%7D' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjpwbGFjZWhvbGRlcl9hdXRoX3NlY3JldA=='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ SakruaCloudProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/sakuracloud.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/sakuracloud.py:227: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/cloud/zone/is1a/api/cloud/1.1/commonserviceitem?%7B%22Filter%22:%20%7B%22Provider.Class%22:%20%22dns%22,%20%22Name%22:%20%22example.com%22%7D%7D' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjpwbGFjZWhvbGRlcl9hdXRoX3NlY3JldA=='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ SakruaCloudProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/sakuracloud.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/sakuracloud.py:227: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/cloud/zone/is1a/api/cloud/1.1/commonserviceitem?%7B%22Filter%22:%20%7B%22Provider.Class%22:%20%22dns%22,%20%22Name%22:%20%22example.com%22%7D%7D' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjpwbGFjZWhvbGRlcl9hdXRoX3NlY3JldA=='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ SakruaCloudProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/sakuracloud.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/sakuracloud.py:227: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/cloud/zone/is1a/api/cloud/1.1/commonserviceitem?%7B%22Filter%22:%20%7B%22Provider.Class%22:%20%22dns%22,%20%22Name%22:%20%22example.com%22%7D%7D' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjpwbGFjZWhvbGRlcl9hdXRoX3NlY3JldA=='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ SakruaCloudProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/sakuracloud.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/sakuracloud.py:227: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/cloud/zone/is1a/api/cloud/1.1/commonserviceitem?%7B%22Filter%22:%20%7B%22Provider.Class%22:%20%22dns%22,%20%22Name%22:%20%22example.com%22%7D%7D' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjpwbGFjZWhvbGRlcl9hdXRoX3NlY3JldA=='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ SakruaCloudProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/sakuracloud.py:40: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/sakuracloud.py:227: in _request ??? 
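Every failure in this run reduces to the same root cause, visible a few frames further down: urllib3 2.3's connection pool logs response.version_string after each request, and vcrpy's VCRHTTPResponse stub, which stands in for the real urllib3 response while a cassette is replayed, does not define that attribute. The shim below is only a sketch of a possible local workaround, assuming vcrpy still exposes the stub as vcr.stubs.VCRHTTPResponse and that a conftest.py is an acceptable place for it; it is not the fix applied to this package.

    # conftest.py (hypothetical local shim, not part of dns-lexicon):
    # urllib3 >= 2.3 reads response.version_string when logging a completed
    # request, but vcrpy's recorded-response stub predates that attribute.
    try:
        from vcr.stubs import VCRHTTPResponse  # class named in the traceback above

        if not hasattr(VCRHTTPResponse, "version_string"):
            # The value is only interpolated into a debug log line, so a static
            # placeholder is enough to keep replayed requests from crashing.
            VCRHTTPResponse.version_string = "HTTP/1.1"
    except ImportError:
        # vcrpy not installed (or its module layout changed); nothing to patch.
        pass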
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/cloud/zone/is1a/api/cloud/1.1/commonserviceitem?%7B%22Filter%22:%20%7B%22Provider.Class%22:%20%22dns%22,%20%22Name%22:%20%22example.com%22%7D%7D' body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...son', 'Content-Length': '2', 'Authorization': 'Basic cGxhY2Vob2xkZXJfYXV0aF90b2tlbjpwbGFjZWhvbGRlcl9hdXRoX3NlY3JldA=='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError ______________ SoftLayerProviderTests.test_provider_authenticate _______________ self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/softlayer.py:55: in authenticate ??? /usr/lib/python3.13/site-packages/SoftLayer/utils.py:263: in resolve_ids ??? /usr/lib/python3.13/site-packages/SoftLayer/utils.py:285: in resolve_ids ??? /usr/lib/python3.13/site-packages/SoftLayer/managers/dns.py:31: in _get_zone_id_from_name ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:610: in call_handler ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:578: in call ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:302: in call ??? /usr/lib/python3.13/site-packages/SoftLayer/transports/xmlrpc.py:92: in __call__ ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/xmlrpc/v3.1/SoftLayer_Account' body = b"\n\ngetDomains\n\n\n<...>\n\n\n\n\n\n\n\n\n" headers = {'User-Agent': 'softlayer-python/v6.1.4', 'Accept-Encoding': 'gzip, deflate, compress', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/xml', 'Content-Length': '855'} retries = Retry(total=10, connect=3, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ SoftLayerProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/softlayer.py:55: in authenticate ??? /usr/lib/python3.13/site-packages/SoftLayer/utils.py:263: in resolve_ids ??? /usr/lib/python3.13/site-packages/SoftLayer/utils.py:285: in resolve_ids ??? /usr/lib/python3.13/site-packages/SoftLayer/managers/dns.py:31: in _get_zone_id_from_name ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:610: in call_handler ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:578: in call ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:302: in call ??? /usr/lib/python3.13/site-packages/SoftLayer/transports/xmlrpc.py:92: in __call__ ??? 
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/xmlrpc/v3.1/SoftLayer_Account' body = b"\n\ngetDomains\n\n\n<...>\n\n\n\n\n\n\n\n\n" headers = {'User-Agent': 'softlayer-python/v6.1.4', 'Accept-Encoding': 'gzip, deflate, compress', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/xml', 'Content-Length': '855'} retries = Retry(total=10, connect=3, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ SoftLayerProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:133: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/softlayer.py:55: in authenticate ??? /usr/lib/python3.13/site-packages/SoftLayer/utils.py:263: in resolve_ids ??? /usr/lib/python3.13/site-packages/SoftLayer/utils.py:285: in resolve_ids ??? /usr/lib/python3.13/site-packages/SoftLayer/managers/dns.py:31: in _get_zone_id_from_name ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:610: in call_handler ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:578: in call ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:302: in call ??? /usr/lib/python3.13/site-packages/SoftLayer/transports/xmlrpc.py:92: in __call__ ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/xmlrpc/v3.1/SoftLayer_Account' body = b"\n\ngetDomains\n\n\n<...>\n\n\n\n\n\n\n\n\n" headers = {'User-Agent': 'softlayer-python/v6.1.4', 'Accept-Encoding': 'gzip, deflate, compress', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/xml', 'Content-Length': '855'} retries = Retry(total=10, connect=3, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ SoftLayerProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = > ??? tests/providers/integration_tests.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/softlayer.py:55: in authenticate ??? /usr/lib/python3.13/site-packages/SoftLayer/utils.py:263: in resolve_ids ??? /usr/lib/python3.13/site-packages/SoftLayer/utils.py:285: in resolve_ids ??? /usr/lib/python3.13/site-packages/SoftLayer/managers/dns.py:31: in _get_zone_id_from_name ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:610: in call_handler ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:578: in call ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:302: in call ??? /usr/lib/python3.13/site-packages/SoftLayer/transports/xmlrpc.py:92: in __call__ ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/xmlrpc/v3.1/SoftLayer_Account' body = b"\n\ngetDomains\n\n\n<...>\n\n\n\n\n\n\n\n\n" headers = {'User-Agent': 'softlayer-python/v6.1.4', 'Accept-Encoding': 'gzip, deflate, compress', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/xml', 'Content-Length': '855'} retries = Retry(total=10, connect=3, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ SoftLayerProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = > ??? tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/softlayer.py:55: in authenticate ??? /usr/lib/python3.13/site-packages/SoftLayer/utils.py:263: in resolve_ids ??? /usr/lib/python3.13/site-packages/SoftLayer/utils.py:285: in resolve_ids ??? /usr/lib/python3.13/site-packages/SoftLayer/managers/dns.py:31: in _get_zone_id_from_name ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:610: in call_handler ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:578: in call ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:302: in call ??? /usr/lib/python3.13/site-packages/SoftLayer/transports/xmlrpc.py:92: in __call__ ??? 
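For reference, the retries and timeout values repeated in each traceback are ordinary urllib3 objects handed down by the requests adapter: the SoftLayer calls arrive with Retry(total=10, connect=3, read=None, redirect=None, status=None) and an unlimited Timeout. The snippet below is illustrative only (the host URL is a placeholder, and the values are simply copied from the tracebacks above), showing how the same objects can be built and passed explicitly:

    from urllib3 import PoolManager
    from urllib3.util import Retry, Timeout

    # Same values as in the SoftLayer tracebacks: up to 10 retries overall,
    # 3 on connection errors, and no connect/read/total time limits.
    retries = Retry(total=10, connect=3, read=None, redirect=None, status=None)
    timeout = Timeout(connect=None, read=None, total=None)

    http = PoolManager()
    # http.request("POST", "https://api.example.com/xmlrpc/v3.1/SoftLayer_Account",
    #              retries=retries, timeout=timeout)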
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/xmlrpc/v3.1/SoftLayer_Account' body = b"\n\ngetDomains\n\n\n<...>\n\n\n\n\n\n\n\n\n" headers = {'User-Agent': 'softlayer-python/v6.1.4', 'Accept-Encoding': 'gzip, deflate, compress', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/xml', 'Content-Length': '855'} retries = Retry(total=10, connect=3, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ SoftLayerProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/softlayer.py:55: in authenticate ??? /usr/lib/python3.13/site-packages/SoftLayer/utils.py:263: in resolve_ids ??? /usr/lib/python3.13/site-packages/SoftLayer/utils.py:285: in resolve_ids ??? /usr/lib/python3.13/site-packages/SoftLayer/managers/dns.py:31: in _get_zone_id_from_name ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:610: in call_handler ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:578: in call ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:302: in call ??? /usr/lib/python3.13/site-packages/SoftLayer/transports/xmlrpc.py:92: in __call__ ??? [... requests/urllib3 _make_request frames identical to the traceback shown above ...] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
[The next seven SoftLayerProviderTests failures repeat the same traceback verbatim -- tests/providers/integration_tests.py:418 in _construct_authenticated_provider -> src/lexicon/_private/providers/softlayer.py:55 in authenticate -> SoftLayer utils.resolve_ids -> SoftLayer/managers/dns.py:31 in _get_zone_id_from_name -> SoftLayer/API.py call -> SoftLayer/transports/xmlrpc.py:92 in __call__ -> requests/sessions.py -> requests/adapters.py -> urllib3/connectionpool.py:787 in urlopen -> _make_request -- and each ends with the same error at urllib3/connectionpool.py:551: AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'. Only the failing test differs:]
_ SoftLayerProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ (tests/providers/integration_tests.py:303)
_ SoftLayerProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ (tests/providers/integration_tests.py:327)
_ SoftLayerProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ (tests/providers/integration_tests.py:313)
_ SoftLayerProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ (tests/providers/integration_tests.py:294)
_ SoftLayerProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ (tests/providers/integration_tests.py:211)
_ SoftLayerProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ (tests/providers/integration_tests.py:199)
_ SoftLayerProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ (tests/providers/integration_tests.py:185)
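(Note on the failure: newer urllib3 logs response.version_string inside _make_request, but the responses replayed in these tests are vcrpy VCRHTTPResponse stubs, and the stub packaged here does not define that attribute, so every SoftLayer provider test that replays a cassette dies in the log.debug() call shown above. Later vcrpy releases address this; until the packaged vcrpy catches up, a local shim is enough. A minimal sketch, assuming a test-suite conftest.py and that vcr.stubs.VCRHTTPResponse is importable -- this is not part of dns-lexicon or its upstream:

    # conftest.py -- hypothetical local workaround, not shipped by upstream
    from vcr.stubs import VCRHTTPResponse

    # In the traceback above, urllib3's connectionpool reads this attribute
    # only for its debug log line, so a constant placeholder is sufficient
    # for replayed (recorded) responses.
    if not hasattr(VCRHTTPResponse, "version_string"):
        VCRHTTPResponse.version_string = "HTTP/1.1"

Alternatively, the affected suite could be deselected for the build with a standard pytest filter such as -k "not SoftLayerProviderTests" until vcrpy and urllib3 agree again.)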
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ SoftLayerProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/softlayer.py:55: in authenticate ??? /usr/lib/python3.13/site-packages/SoftLayer/utils.py:263: in resolve_ids ??? /usr/lib/python3.13/site-packages/SoftLayer/utils.py:285: in resolve_ids ??? /usr/lib/python3.13/site-packages/SoftLayer/managers/dns.py:31: in _get_zone_id_from_name ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:610: in call_handler ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:578: in call ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:302: in call ??? /usr/lib/python3.13/site-packages/SoftLayer/transports/xmlrpc.py:92: in __call__ ??? 
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/xmlrpc/v3.1/SoftLayer_Account' body = b"\n\ngetDomains\n\n\n<...>\n\n\n\n\n\n\n\n\n" headers = {'User-Agent': 'softlayer-python/v6.1.4', 'Accept-Encoding': 'gzip, deflate, compress', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/xml', 'Content-Length': '855'} retries = Retry(total=10, connect=3, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ SoftLayerProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/softlayer.py:55: in authenticate ??? /usr/lib/python3.13/site-packages/SoftLayer/utils.py:263: in resolve_ids ??? /usr/lib/python3.13/site-packages/SoftLayer/utils.py:285: in resolve_ids ??? /usr/lib/python3.13/site-packages/SoftLayer/managers/dns.py:31: in _get_zone_id_from_name ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:610: in call_handler ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:578: in call ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:302: in call ??? /usr/lib/python3.13/site-packages/SoftLayer/transports/xmlrpc.py:92: in __call__ ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/xmlrpc/v3.1/SoftLayer_Account' body = b"\n\ngetDomains\n\n\n<...>\n\n\n\n\n\n\n\n\n" headers = {'User-Agent': 'softlayer-python/v6.1.4', 'Accept-Encoding': 'gzip, deflate, compress', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/xml', 'Content-Length': '855'} retries = Retry(total=10, connect=3, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ SoftLayerProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/softlayer.py:55: in authenticate ??? /usr/lib/python3.13/site-packages/SoftLayer/utils.py:263: in resolve_ids ??? /usr/lib/python3.13/site-packages/SoftLayer/utils.py:285: in resolve_ids ??? /usr/lib/python3.13/site-packages/SoftLayer/managers/dns.py:31: in _get_zone_id_from_name ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:610: in call_handler ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:578: in call ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:302: in call ??? /usr/lib/python3.13/site-packages/SoftLayer/transports/xmlrpc.py:92: in __call__ ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/xmlrpc/v3.1/SoftLayer_Account' body = b"\n\ngetDomains\n\n\n<...>\n\n\n\n\n\n\n\n\n" headers = {'User-Agent': 'softlayer-python/v6.1.4', 'Accept-Encoding': 'gzip, deflate, compress', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/xml', 'Content-Length': '855'} retries = Retry(total=10, connect=3, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ SoftLayerProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/softlayer.py:55: in authenticate ??? /usr/lib/python3.13/site-packages/SoftLayer/utils.py:263: in resolve_ids ??? /usr/lib/python3.13/site-packages/SoftLayer/utils.py:285: in resolve_ids ??? /usr/lib/python3.13/site-packages/SoftLayer/managers/dns.py:31: in _get_zone_id_from_name ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:610: in call_handler ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:578: in call ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:302: in call ??? /usr/lib/python3.13/site-packages/SoftLayer/transports/xmlrpc.py:92: in __call__ ??? 
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/xmlrpc/v3.1/SoftLayer_Account' body = b"\n\ngetDomains\n\n\n<...>\n\n\n\n\n\n\n\n\n" headers = {'User-Agent': 'softlayer-python/v6.1.4', 'Accept-Encoding': 'gzip, deflate, compress', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/xml', 'Content-Length': '855'} retries = Retry(total=10, connect=3, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ SoftLayerProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/softlayer.py:55: in authenticate ??? /usr/lib/python3.13/site-packages/SoftLayer/utils.py:263: in resolve_ids ??? /usr/lib/python3.13/site-packages/SoftLayer/utils.py:285: in resolve_ids ??? /usr/lib/python3.13/site-packages/SoftLayer/managers/dns.py:31: in _get_zone_id_from_name ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:610: in call_handler ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:578: in call ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:302: in call ??? /usr/lib/python3.13/site-packages/SoftLayer/transports/xmlrpc.py:92: in __call__ ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/xmlrpc/v3.1/SoftLayer_Account' body = b"\n\ngetDomains\n\n\n<...>\n\n\n\n\n\n\n\n\n" headers = {'User-Agent': 'softlayer-python/v6.1.4', 'Accept-Encoding': 'gzip, deflate, compress', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/xml', 'Content-Length': '855'} retries = Retry(total=10, connect=3, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ SoftLayerProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/softlayer.py:55: in authenticate ??? /usr/lib/python3.13/site-packages/SoftLayer/utils.py:263: in resolve_ids ??? /usr/lib/python3.13/site-packages/SoftLayer/utils.py:285: in resolve_ids ??? /usr/lib/python3.13/site-packages/SoftLayer/managers/dns.py:31: in _get_zone_id_from_name ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:610: in call_handler ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:578: in call ??? /usr/lib/python3.13/site-packages/SoftLayer/API.py:302: in call ??? /usr/lib/python3.13/site-packages/SoftLayer/transports/xmlrpc.py:92: in __call__ ??? /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/xmlrpc/v3.1/SoftLayer_Account' body = b"\n\ngetDomains\n\n\n<...>\n\n\n\n\n\n\n\n\n" headers = {'User-Agent': 'softlayer-python/v6.1.4', 'Accept-Encoding': 'gzip, deflate, compress', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Type': 'application/xml', 'Content-Length': '855'} retries = Retry(total=10, connect=3, read=None, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _______________ TimewebProviderTests.test_provider_authenticate ________________ self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/timeweb.py:88: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/timeweb.py:66: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/domains/example.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ TimewebProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = > ??? tests/providers/integration_tests.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/timeweb.py:88: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/timeweb.py:66: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/domains/example.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ TimewebProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = > ??? 
tests/providers/integration_tests.py:147:
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ TimewebProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _
tests/providers/integration_tests.py:140:
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ TimewebProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _
tests/providers/integration_tests.py:489:
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ TimewebProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _
tests/providers/integration_tests.py:475:
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
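Every Timeweb failure in this run is the same incompatibility: urllib3's connection pool now interpolates response.version_string into its debug log line right after getresponse(), but the VCRHTTPResponse stub that vcrpy substitutes for the real HTTP response in these cassette-backed tests does not define that attribute, so each call into timeweb.py's authenticate() dies with the AttributeError before any provider logic runs. The durable fix is a vcrpy release that knows about the attribute; until one is packaged, a local stopgap could patch the stub from the test suite's conftest.py. The sketch below is an assumption-laden illustration, not the project's fix: it assumes vcrpy exposes the stub as vcr.stubs.VCRHTTPResponse and that a best-effort HTTP/1.x string is acceptable, since the value is only used for logging.

# Hypothetical stopgap (not part of dns-lexicon): give vcrpy's stub the
# attribute urllib3 2.x reads when logging a completed request.
from vcr.stubs import VCRHTTPResponse  # assumed import path for the stub class

if not hasattr(VCRHTTPResponse, "version_string"):
    # urllib3 only puts this value into a debug log line, so mapping the
    # numeric HTTP version (falling back to HTTP/1.1) is sufficient here.
    VCRHTTPResponse.version_string = property(  # type: ignore[attr-defined]
        lambda self: {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(
            getattr(self, "version", 11), "HTTP/?"
        )
    )

Pinning urllib3 below the release that introduced version_string, or deselecting the cassette-backed provider tests in check(), would be alternative stopgaps with the same effect on this build.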
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ TimewebProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? 
tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/timeweb.py:88: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/timeweb.py:66: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/domains/example.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ TimewebProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/timeweb.py:88: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/timeweb.py:66: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/domains/example.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ TimewebProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/timeweb.py:88: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/timeweb.py:66: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/domains/example.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ TimewebProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/timeweb.py:88: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/timeweb.py:66: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
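All of the TimewebProviderTests failures in this run are the same incompatibility: the urllib3 in the chroot (apparently 2.3 or newer, judging by the version_string in its log.debug call) logs response.version_string after every request, while the VCRHTTPResponse stub that vcrpy substitutes for recorded cassettes does not define that attribute, so each cassette-backed request dies inside urllib3 before any provider code runs. A quick probe such as the following (a local sketch, not part of the PKGBUILD or the upstream test suite) shows which side is out of date:

    # compat_probe.py -- hypothetical helper, not shipped anywhere
    import urllib3
    from vcr.stubs import VCRHTTPResponse  # vcrpy's stand-in for urllib3 responses

    print("urllib3", urllib3.__version__)
    print("stub has version_string:", hasattr(VCRHTTPResponse, "version_string"))

If the second line prints False while urllib3 reports 2.3.0 or newer, the failures below are expected regardless of the Timeweb provider itself.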
_ TimewebProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _

self = >

???

tests/providers/integration_tests.py:541: 
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ TimewebProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _

self = >

???

tests/providers/integration_tests.py:521: 
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ TimewebProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _

self = >

???

tests/providers/integration_tests.py:507: 
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ TimewebProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _

self = >

???

tests/providers/integration_tests.py:199: 
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ TimewebProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _

self = >

???

tests/providers/integration_tests.py:185: 
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ TimewebProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _

self = >

???

tests/providers/integration_tests.py:501: 
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ TimewebProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _

self = >

???

tests/providers/integration_tests.py:173: 
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ TimewebProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _

self = >

???

tests/providers/integration_tests.py:166: 
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/timeweb.py:88: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/timeweb.py:66: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/domains/example.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ TimewebProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/timeweb.py:88: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/timeweb.py:66: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/domains/example.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ TimewebProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/timeweb.py:88: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/timeweb.py:66: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/domains/example.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ TimewebProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/timeweb.py:88: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/timeweb.py:66: in _request ??? 
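# Note (not part of the captured pytest output): the timeout = Timeout(connect=None,
# read=None, total=None) object in the locals below corresponds to no explicit
# timeout being set. Purely as an illustration of the ":param timeout:" behaviour
# described in the docstring quoted above (URL illustrative), an explicit
# per-request timeout is built like this:
from urllib3 import PoolManager
from urllib3.util.timeout import Timeout

http = PoolManager()
# Fail the TCP connect after 5s and any single socket read after 10s; a plain
# float would apply the same limit to both phases.
resp = http.request(
    "GET", "http://example.com/", timeout=Timeout(connect=5.0, read=10.0)
)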
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api/v1/domains/example.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _______________ TransipProviderTests.test_provider_authenticate ________________ self = > ??? 
tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/transip.py:94: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/v6/auth' body = b'{"login": "placeholder_auth_username", "nonce": "d67d3616a52340ad89eea3b650f749ab", "global_key": true}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...YSRLW8w4iCynmfyQAyW+07Jr0Sgfh8SmqxeIFg+a9PQT5du92U3hXw==', 'Content-Length': '103', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. 
Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ TransipProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/transip.py:94: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/v6/auth' body = b'{"login": "placeholder_auth_username", "nonce": "9dd441515fe545029cc3d9788926208c", "global_key": true}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...4HytWvVdWHG9+myteay/gu7IBcwVYvlTD8YArJHjN8IfMlV0kBM4UA==', 'Content-Length': '103', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. 
If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ TransipProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _ self = > ??? tests/providers/test_transip.py:58: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/transip.py:94: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/v6/auth' body = b'{"login": "placeholder_auth_username", "nonce": "503ec326b0b142f7b65ccbc1292e85b8", "global_key": true}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...f4Y/yPWfPHPDx2Dfn4czJYULbAE/XJthwDes7hnmVeTvzTnHCnUYPA==', 'Content-Length': '103', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ TransipProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = > ??? tests/providers/integration_tests.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/transip.py:94: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/v6/auth' body = b'{"login": "placeholder_auth_username", "nonce": "878ae8f09ec2421281fddf5c7a629d9c", "global_key": true}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...WLUP3di40LSSfV3tY/xqvXUNE6IPhxUZqXvKLzv/tXxD6BgbG9739A==', 'Content-Length': '103', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ TransipProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = > ??? tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/transip.py:94: in authenticate ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/v6/auth' body = b'{"login": "placeholder_auth_username", "nonce": "3e4272f322304658bc49fabe98514afa", "global_key": true}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...IrdTgWqdXEXfDrC+vd4VyVwRq8lODxYqRk1Bcj0lObC0TntEf69YjA==', 'Content-Length': '103', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ TransipProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? 
tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/transip.py:94: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/v6/auth' body = b'{"login": "placeholder_auth_username", "nonce": "23a2ecaab9794b5b99d4b826e061c958", "global_key": true}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...9ggWl0V6w4eqXbib+8FvyayX4gmsuK1wMavWq4DL/MnoaTlDiNLrdw==', 'Content-Length': '103', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. 
Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ TransipProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _ self = > ??? tests/providers/integration_tests.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/transip.py:94: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/v6/auth' body = b'{"login": "placeholder_auth_username", "nonce": "6e6b108c866f40c2b7299ee9e9aff3d3", "global_key": true}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...34o8njWWq494dFqgt2y4nUDGPW0gwilZv+SugG8Ui8SAAtYJLAtQLw==', 'Content-Length': '103', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. 
If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ TransipProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ self = > ??? tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/transip.py:94: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/v6/auth' body = b'{"login": "placeholder_auth_username", "nonce": "384efda1bdb14a09995897b5368ab352", "global_key": true}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...Uy0ZcKAPakxZhj/ShH7d0ZguxSJFVNDNnvAgCtjG+IRz1cBdzN96lA==', 'Content-Length': '103', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ TransipProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/transip.py:94: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/v6/auth' body = b'{"login": "placeholder_auth_username", "nonce": "b5743f46c4714a8d9b167742896184a1", "global_key": true}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...WYaw+k9/clPc2wSO7QPEbM15dUzAQGhes0EqekG890SCIHvnfOkd5A==', 'Content-Length': '103', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ TransipProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/transip.py:94: in authenticate ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/v6/auth' body = b'{"login": "placeholder_auth_username", "nonce": "2b9642c5492a4dea832f9f1b37e526bd", "global_key": true}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...xefm7BrB5bSYBsW9mQNikRCldzd1cQmLRSfHLzx02A6FON5b2kfjvg==', 'Content-Length': '103', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. 
:param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ TransipProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? 
tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/transip.py:94: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/v6/auth' body = b'{"login": "placeholder_auth_username", "nonce": "2b507d50b80a458f936a3e14111d2409", "global_key": true}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...J/pniue3+eHHff2iHn7D112rMw9QAOq/3EL1l1ZCzviG9blDYok4Hg==', 'Content-Length': '103', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. 
Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ TransipProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/transip.py:94: in authenticate ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/v6/auth' body = b'{"login": "placeholder_auth_username", "nonce": "6c0f693ac833497995e7d820fd5084ec", "global_key": true}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-aliv...16sO1DxUQFQAXAPtmhvEWyQYvVOge4JZ7kGRBo2w3yfYpPJEMB3Q7w==', 'Content-Length': '103', 'Content-Type': 'application/json'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. 
If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
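Note: every remaining TransIP provider test fails at the same point. urllib3 2.3's connectionpool.py:551 formats response.version_string into its debug log line, and the VCRHTTPResponse object replayed from the test cassettes does not define that attribute. The sketch below is a hypothetical local workaround for the test suite only, assuming vcrpy exposes VCRHTTPResponse in vcr.stubs (the class named in the error) and that urllib3 reads version_string here only for logging; upgrading python-vcrpy to a release that already provides version_string would be the cleaner fix, if one is packaged.

# conftest.py: hypothetical workaround sketch, not part of the PKGBUILD.
# Assumes vcrpy's replayed response class is vcr.stubs.VCRHTTPResponse and
# that a plain class attribute is enough to satisfy the log.debug() call.
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):
    # urllib3 2.3 logs response.version_string ("HTTP/1.1" or "HTTP/2");
    # replayed cassette responses never carried it, so default to HTTP/1.1.
    VCRHTTPResponse.version_string = "HTTP/1.1"

Dropping such a shim into the test tree (or carrying it as a patch in the package) would only paper over the logging call; it does not restore full urllib3 2.3 compatibility in vcrpy.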
[The following eight failures hit the identical urllib3 _make_request traceback shown above; the duplicated urllib3 source and docstring are elided, keeping only the per-test frames, request body and error.]

_ TransipProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _
self = > ??? tests/providers/integration_tests.py:541: tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/transip.py:94: in authenticate ???
method = 'POST', url = '/v6/auth' body = b'{"login": "placeholder_auth_username", "nonce": "2035b7a640cd4bcc85bbb3fd1e990d22", "global_key": true}'
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ TransipProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _
self = > ??? tests/providers/integration_tests.py:521: tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/transip.py:94: in authenticate ???
method = 'POST', url = '/v6/auth' body = b'{"login": "placeholder_auth_username", "nonce": "8c3188912b4049c4a66c3c1a4d393c13", "global_key": true}'
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ TransipProviderTests.test_provider_when_calling_list_records_after_setting_ttl _
self = > ??? tests/providers/integration_tests.py:211: tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/transip.py:94: in authenticate ???
method = 'POST', url = '/v6/auth' body = b'{"login": "placeholder_auth_username", "nonce": "c5a6643c32184a129a4efb969bae6330", "global_key": true}'
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ TransipProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _
self = > ??? tests/providers/integration_tests.py:507: tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/transip.py:94: in authenticate ???
method = 'POST', url = '/v6/auth' body = b'{"login": "placeholder_auth_username", "nonce": "62599bebb01d4041bcb52501ff7792c7", "global_key": true}'
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ TransipProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _
self = > ??? tests/providers/integration_tests.py:199: tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/transip.py:94: in authenticate ???
method = 'POST', url = '/v6/auth' body = b'{"login": "placeholder_auth_username", "nonce": "ccf4ddb3bdeb49a3b7dddc64980f5785", "global_key": true}'
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ TransipProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _
self = > ??? tests/providers/integration_tests.py:185: tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/transip.py:94: in authenticate ???
method = 'POST', url = '/v6/auth' body = b'{"login": "placeholder_auth_username", "nonce": "af7a4ac5921f49888ef0762ae9d67de0", "global_key": true}'
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ TransipProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _
self = > ??? tests/providers/integration_tests.py:501: tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/transip.py:94: in authenticate ???
method = 'POST', url = '/v6/auth' body = b'{"login": "placeholder_auth_username", "nonce": "0863f203a9464d55b0d2d32512024848", "global_key": true}'
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ TransipProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _
self = > ??? tests/providers/integration_tests.py:173: tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/transip.py:94: in authenticate ???
method = 'POST', url = '/v6/auth' body = b'{"login": "placeholder_auth_username", "nonce": "97f51c4ee80746d6a0ebc0004d2c7c9f", "global_key": true}'
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ TransipProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ???
tests/providers/integration_tests.py:166:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/transip.py:94: in authenticate
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
method = 'POST', url = '/v6/auth'
body = b'{"login": "placeholder_auth_username", "nonce": "ae09e2bbd3bd478b9916856ce9da4cdf", "global_key": true}'
>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ TransipProviderTests.test_provider_when_calling_update_record_should_modify_record _

self = > ???

tests/providers/integration_tests.py:240:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/transip.py:94: in authenticate
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
method = 'POST', url = '/v6/auth'
body = b'{"login": "placeholder_auth_username", "nonce": "91192aa0abe04b4ab9e0774c8214bedd", "global_key": true}'
>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ TransipProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _

self = > ???

tests/providers/integration_tests.py:251:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/transip.py:94: in authenticate
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
method = 'POST', url = '/v6/auth'
body = b'{"login": "placeholder_auth_username", "nonce": "30d0e15352324d49a588bb10a06345da", "global_key": true}'
>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ TransipProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _

self = > ???

tests/providers/integration_tests.py:275:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/transip.py:94: in authenticate
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
method = 'POST', url = '/v6/auth'
body = b'{"login": "placeholder_auth_username", "nonce": "2bbcc8da52b744aa9cdd8d17751ad8ad", "global_key": true}'
>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ TransipProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _

self = > ???

tests/providers/integration_tests.py:259:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/transip.py:94: in authenticate
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
method = 'POST', url = '/v6/auth'
body = b'{"login": "placeholder_auth_username", "nonce": "196737deffb245689b857743a5b6f84d", "global_key": true}'
>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_______________ UltradnsProviderTests.test_provider_authenticate _______________

self = > ???
tests/providers/integration_tests.py:106:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/ultradns.py:92: in authenticate
    ???
src/lexicon/_private/providers/ultradns.py:69: in zone_data
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/ultradns.py:261: in _request
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
method = 'GET', url = '/v2/zones/example-abtest.com./rrsets', body = '{}'
>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ UltradnsProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

self = > ???

tests/providers/integration_tests.py:126:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/ultradns.py:92: in authenticate
    ???
src/lexicon/_private/providers/ultradns.py:69: in zone_data
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/ultradns.py:261: in _request
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
method = 'GET', url = '/v2/zones/example-abtest.com./rrsets', body = '{}'
>               response.version_string,
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ UltradnsProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

self = > ???

tests/providers/integration_tests.py:133:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/ultradns.py:92: in authenticate
    ???
src/lexicon/_private/providers/ultradns.py:69: in zone_data
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/ultradns.py:261: in _request
    ???
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
method = 'GET', url = '/v2/zones/example-abtest.com./rrsets', body = '{}'
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
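Every failure in this run is the same incompatibility: the log.debug call shown in the traceback (urllib3's connectionpool.py:551 on this system) reads response.version_string, an attribute that the VCRHTTPResponse playback stub substituted by vcrpy does not define, so the request dies inside urllib3 before the recorded cassette response ever reaches the lexicon provider code. A minimal conftest.py shim along the following lines is one way to unblock the suite until a vcrpy that knows about the attribute is packaged; this is a sketch, not part of the dns-lexicon sources, and it assumes vcrpy still exposes VCRHTTPResponse from vcr.stubs and that the stub may lack an integer version field, hence the HTTP/1.1 fallback.

    # hypothetical conftest.py shim -- teach vcrpy's playback stub the attribute
    # that newer urllib3 logs for every response
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        def _version_string(self):
            # http.client stores the protocol version as an int (10 or 11);
            # fall back to HTTP/1.1 when the recorded response has no version.
            return {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(getattr(self, "version", 11), "HTTP/1.1")

        VCRHTTPResponse.version_string = property(_version_string)

Pinning urllib3 to a release that does not log version_string, or moving to a vcrpy version that already defines the attribute (if one is available), would achieve the same without patching at test time.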
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/zones/example-abtest.com./rrsets', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ UltradnsProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = > ??? 
tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ultradns.py:92: in authenticate ??? src/lexicon/_private/providers/ultradns.py:69: in zone_data ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/ultradns.py:261: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/zones/example-abtest.com./rrsets', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ UltradnsProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ultradns.py:92: in authenticate ??? src/lexicon/_private/providers/ultradns.py:69: in zone_data ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/ultradns.py:261: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/zones/example-abtest.com./rrsets', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ UltradnsProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _ self = > ??? tests/providers/integration_tests.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ultradns.py:92: in authenticate ??? src/lexicon/_private/providers/ultradns.py:69: in zone_data ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/ultradns.py:261: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/zones/example-abtest.com./rrsets', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ UltradnsProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ self = > ??? tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ultradns.py:92: in authenticate ??? src/lexicon/_private/providers/ultradns.py:69: in zone_data ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/ultradns.py:261: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/zones/example-abtest.com./rrsets', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ UltradnsProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? 
tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ultradns.py:92: in authenticate ??? src/lexicon/_private/providers/ultradns.py:69: in zone_data ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/ultradns.py:261: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/zones/example-abtest.com./rrsets', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ UltradnsProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ultradns.py:92: in authenticate ??? src/lexicon/_private/providers/ultradns.py:69: in zone_data ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/ultradns.py:261: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/zones/example-abtest.com./rrsets', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ UltradnsProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ultradns.py:92: in authenticate ??? src/lexicon/_private/providers/ultradns.py:69: in zone_data ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/ultradns.py:261: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/zones/example-abtest.com./rrsets', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ UltradnsProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ultradns.py:92: in authenticate ??? src/lexicon/_private/providers/ultradns.py:69: in zone_data ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/ultradns.py:261: in _request ??? 
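Note on the recurring failure: every UltradnsProviderTests error in this run is the same incompatibility, not a provider bug. Recent urllib3 releases log response.version_string inside _make_request, but the VCRHTTPResponse object that vcrpy substitutes for the real HTTP response during cassette playback does not define that attribute, so the debug-log call raises AttributeError before the recorded response is ever returned. A minimal local shim is sketched below; the file name and the constant default are assumptions for illustration, not the upstream vcrpy fix, which should be preferred once available.

    # conftest.py (hypothetical local shim, not the upstream vcrpy fix)
    # Newer urllib3 logs response.version_string in _make_request; the
    # VCRHTTPResponse stub used during cassette playback may not define it,
    # which produces the AttributeError seen in these tests. Backfill a
    # default on the stub class when the attribute is missing.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # "HTTP/1.1" is an assumed default for recorded exchanges; derive it
        # from the stored protocol version if the cassette provides one.
        VCRHTTPResponse.version_string = "HTTP/1.1"
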
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/zones/example-abtest.com./rrsets', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
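The retries and timeout locals shown in each failure come from requests' defaults rather than from lexicon: with max_retries left at 0, requests mounts an HTTPAdapter whose urllib3 Retry allows a single attempt and disables retries on read errors, and the Timeout carries no limits. A short sketch of the equivalent explicit configuration (the session setup itself is illustrative, not quoted from lexicon):

    import requests
    from requests.adapters import HTTPAdapter
    from urllib3.util.retry import Retry

    # "retries = Retry(total=0, connect=None, read=False, ...)" in the locals
    # above is what requests builds for its default max_retries=0: one attempt,
    # no retries on read errors. An equivalent explicit configuration:
    session = requests.Session()
    session.mount("https://", HTTPAdapter(max_retries=Retry(total=0, read=False)))
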
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ UltradnsProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? 
tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ultradns.py:92: in authenticate ??? src/lexicon/_private/providers/ultradns.py:69: in zone_data ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/ultradns.py:261: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/zones/example-abtest.com./rrsets', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ UltradnsProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ultradns.py:92: in authenticate ??? src/lexicon/_private/providers/ultradns.py:69: in zone_data ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/ultradns.py:261: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/zones/example-abtest.com./rrsets', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ UltradnsProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ self = > ??? tests/providers/integration_tests.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ultradns.py:92: in authenticate ??? src/lexicon/_private/providers/ultradns.py:69: in zone_data ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/ultradns.py:261: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/zones/example-abtest.com./rrsets', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ UltradnsProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? tests/providers/integration_tests.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ultradns.py:92: in authenticate ??? src/lexicon/_private/providers/ultradns.py:69: in zone_data ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/ultradns.py:261: in _request ??? 
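For context, these provider integration tests are not talking to the live UltraDNS API: the Bearer placeholder_auth_token header and the VCRHTTPResponse type in the traceback indicate that the HTTP exchanges are replayed from prerecorded vcrpy cassettes. A minimal sketch of that replay pattern follows; the cassette path and URL are illustrative placeholders rather than values taken from the lexicon test suite.

    import requests
    import vcr

    # Replay a prerecorded HTTP exchange instead of contacting the real API.
    # "fixtures/ultradns.yaml" and the URL are placeholders for illustration.
    with vcr.use_cassette("fixtures/ultradns.yaml", record_mode="none"):
        resp = requests.get(
            "https://example.invalid/v2/zones/example.com./rrsets",
            headers={"Authorization": "Bearer placeholder_auth_token"},
        )
        print(resp.status_code)
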
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/zones/example-abtest.com./rrsets', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ UltradnsProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? 
tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ultradns.py:92: in authenticate ??? src/lexicon/_private/providers/ultradns.py:69: in zone_data ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/ultradns.py:261: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/zones/example-abtest.com./rrsets', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ UltradnsProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ultradns.py:92: in authenticate ??? src/lexicon/_private/providers/ultradns.py:69: in zone_data ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/ultradns.py:261: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/zones/example-abtest.com./rrsets', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ UltradnsProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ self = > ??? tests/providers/integration_tests.py:501: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ultradns.py:92: in authenticate ??? src/lexicon/_private/providers/ultradns.py:69: in zone_data ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/ultradns.py:261: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/zones/example-abtest.com./rrsets', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ UltradnsProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ultradns.py:92: in authenticate ??? src/lexicon/_private/providers/ultradns.py:69: in zone_data ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/ultradns.py:261: in _request ??? 
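[annotation] Every failure in this run shares one root cause rather than being a provider-specific bug: the newer urllib3 in the chroot (2.3 at the time of this build, presumably) logs response.version_string inside _make_request (the log.debug call visible at connectionpool.py:551 above), while vcrpy replays recorded cassettes through its VCRHTTPResponse stub, which predates that attribute. A minimal, illustrative check of the mismatch follows; it is not part of the build log, and the "prints False" comment assumes a vcrpy release without version_string support:

    # check_stub.py - illustrative only, not part of this build
    import vcr.stubs

    # Typically prints False on vcrpy releases that predate urllib3's
    # version_string attribute - exactly the attribute the log.debug call
    # above tries to read, hence the AttributeError throughout this run.
    print(hasattr(vcr.stubs.VCRHTTPResponse, "version_string"))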
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/zones/example-abtest.com./rrsets', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ UltradnsProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? 
tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ultradns.py:92: in authenticate ??? src/lexicon/_private/providers/ultradns.py:69: in zone_data ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/ultradns.py:261: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/zones/example-abtest.com./rrsets', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ UltradnsProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ultradns.py:92: in authenticate ??? src/lexicon/_private/providers/ultradns.py:69: in zone_data ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/ultradns.py:261: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/zones/example-abtest.com./rrsets', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. 
Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ UltradnsProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ultradns.py:92: in authenticate ??? src/lexicon/_private/providers/ultradns.py:69: in zone_data ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/ultradns.py:261: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/zones/example-abtest.com./rrsets', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ UltradnsProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ultradns.py:92: in authenticate ??? src/lexicon/_private/providers/ultradns.py:69: in zone_data ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/ultradns.py:261: in _request ??? 
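[annotation] If the package had to build before an updated python-vcrpy landed, one stopgap would be a test-suite-local shim that gives the replay stub the attribute urllib3 expects. This is a hypothetical sketch, not something dns-lexicon or its PKGBUILD ships; the HTTP-version mapping and the conftest.py placement are assumptions:

    # conftest.py - hypothetical local workaround, not part of dns-lexicon or vcrpy
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # urllib3 reads response.version_string for its per-request debug log;
        # derive a best-effort value from the recorded HTTP version, defaulting
        # to HTTP/1.1 when the stub does not expose a numeric version.
        VCRHTTPResponse.version_string = property(
            lambda self: {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(
                getattr(self, "version", 11), "HTTP/1.1"
            )
        )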
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/zones/example-abtest.com./rrsets', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ UltradnsProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? 
tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/ultradns.py:92: in authenticate ??? src/lexicon/_private/providers/ultradns.py:69: in zone_data ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/ultradns.py:261: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/zones/example-abtest.com./rrsets', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection...ep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Type': 'application/json', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError ________________ VercelProviderTests.test_provider_authenticate ________________ self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/vercel.py:46: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/vercel.py:179: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v5/domains/fullcr1stal.tk', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. 
Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ VercelProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/vercel.py:46: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/vercel.py:179: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v5/domains/fullcr1stal.tk', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
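All of the Vercel provider failures here are the same crash: the urllib3 in this chroot logs response.version_string after each request (connectionpool.py:551 in the frames above), but the VCRHTTPResponse stub that vcrpy substitutes during cassette playback predates that attribute, so every replayed request dies before the provider code is even exercised. The durable fix is a python-vcrpy release that ships version_string; as a stopgap, a shim in the test suite's conftest.py along the following lines would let playback proceed. This is only a sketch under assumptions: the vcr.stubs import path and the HTTP/1.1 placeholder are mine, not anything taken from this log.

# conftest.py shim (sketch): give vcrpy's stub the attribute newer urllib3 expects.
try:
    from vcr.stubs import VCRHTTPResponse  # assumed import path for the playback stub
except ImportError:  # vcrpy not installed; nothing to patch
    VCRHTTPResponse = None

if VCRHTTPResponse is not None and not hasattr(VCRHTTPResponse, "version_string"):

    @property
    def _version_string(self):
        # urllib3 only interpolates this into a debug log line, so a mapped
        # or placeholder value is harmless during cassette playback.
        return {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(getattr(self, "version", 11), "HTTP/1.1")

    VCRHTTPResponse.version_string = _version_string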
_ VercelProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
tests/providers/integration_tests.py:133: [same traceback as above, elided]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ VercelProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
tests/providers/integration_tests.py:156: [same traceback as above, elided]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ VercelProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _
tests/providers/integration_tests.py:147: [same traceback as above, elided]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ VercelProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _
tests/providers/integration_tests.py:140: [same traceback as above, elided]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ VercelProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _
tests/providers/integration_tests.py:489: [same traceback as above, elided]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ VercelProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _
tests/providers/integration_tests.py:475: [same traceback as above, elided]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ VercelProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _
tests/providers/integration_tests.py:303: [same traceback as above, elided]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ VercelProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ???
tests/providers/integration_tests.py:327:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
???
src/lexicon/_private/providers/vercel.py:46: in authenticate
???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/vercel.py:179: in _request
???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
conn =
method = 'GET', url = '/v5/domains/fullcr1stal.tk', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =
preload_content = False, decode_content = False, enforce_content_length = True

The failing statement, at the end of the HTTPConnectionPool._make_request() listing that pytest reproduces in full for each of these failures, is urllib3's request-logging call, reached only after conn.getresponse() has already returned the replayed response:

        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
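Every failure in this run is the same incompatibility rather than a provider bug: urllib3 2.3 started reading response.version_string while logging a completed request in HTTPConnectionPool._make_request(), and the stub response that vcrpy substitutes when replaying the test cassettes predates that attribute, so every provider test dies inside authenticate(). A minimal local shim, assuming the stub really is vcrpy's VCRHTTPResponse (as the error message suggests) and that an updated vcrpy is not yet available, could provide the attribute from the test suite's conftest.py; the HTTP/1.1 value is an assumption about the recorded cassettes, not something taken from the suite:

# conftest.py -- compatibility shim (a sketch, not the upstream fix)
from vcr.stubs import VCRHTTPResponse

# urllib3 >= 2.3 logs response.version_string after getresponse(); vcrpy's
# replayed-response stub does not define it, which is the AttributeError
# recorded above.  Provide a plain class attribute until vcrpy ships a fix.
if not hasattr(VCRHTTPResponse, "version_string"):
    VCRHTTPResponse.version_string = "HTTP/1.1"  # assumed protocol of the cassettes

The durable fix is an upstream vcrpy release that knows about the new attribute, or constraining urllib3 below 2.3 in the build chroot; the shim above only satisfies the logging call.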
_ VercelProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _

self =
???
tests/providers/integration_tests.py:313:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/vercel.py:46: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/vercel.py:179: in _request
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ VercelProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _

self =
???
tests/providers/integration_tests.py:294:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/vercel.py:46: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/vercel.py:179: in _request
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ VercelProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _

self =
???
tests/providers/integration_tests.py:541:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/vercel.py:46: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/vercel.py:179: in _request
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ VercelProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _

self =
???
tests/providers/integration_tests.py:521:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/vercel.py:46: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/vercel.py:179: in _request
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
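If patching the stub is undesirable, the cassette-replay provider tests can instead be skipped until the vcrpy/urllib3 mismatch is resolved. The hook below uses standard pytest API; the "vercel" substring used to pick out the affected tests is illustrative and would need to match however this suite actually names them:

# conftest.py -- alternative stopgap: skip the cassette-replay provider tests
import pytest

def pytest_collection_modifyitems(config, items):
    skip_vcr = pytest.mark.skip(
        reason="vcrpy stub lacks version_string required by urllib3 >= 2.3"
    )
    for item in items:
        # "vercel" is a hypothetical filter; widen it if other providers fail too.
        if "vercel" in item.nodeid.lower():
            item.add_marker(skip_vcr)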
_ VercelProviderTests.test_provider_when_calling_list_records_after_setting_ttl _

self =
???
tests/providers/integration_tests.py:211:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/vercel.py:46: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/vercel.py:179: in _request
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ VercelProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _

self =
???
tests/providers/integration_tests.py:507:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/vercel.py:46: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/vercel.py:179: in _request
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ VercelProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _

self =
???
tests/providers/integration_tests.py:199:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/vercel.py:46: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/vercel.py:179: in _request
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ VercelProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _

self =
???
tests/providers/integration_tests.py:185:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/vercel.py:46: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/vercel.py:179: in _request
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ VercelProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _

self =
???
tests/providers/integration_tests.py:501:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
???
src/lexicon/_private/providers/vercel.py:46: in authenticate
???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/vercel.py:179: in _request
???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v5/domains/fullcr1stal.tk', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ VercelProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? 
tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/vercel.py:46: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/vercel.py:179: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v5/domains/fullcr1stal.tk', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ VercelProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/vercel.py:46: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/vercel.py:179: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v5/domains/fullcr1stal.tk', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. 
Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ VercelProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/vercel.py:46: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/vercel.py:179: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v5/domains/fullcr1stal.tk', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
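The locals dumped with each traceback, Retry(total=0, connect=None, read=False, redirect=None, status=None) and Timeout(connect=None, read=None, total=None), are what requests' HTTPAdapter hands to urllib3 when the caller configures neither retries nor timeouts, matching the retry semantics spelled out in the _make_request docstring repeated above. A sketch of making that explicit on a session follows; it is illustrative only, and the host and settings are assumptions, not taken from the lexicon providers.

    # Illustrative: reconstruct the default adapter settings seen in the traceback
    # locals, then mount a more tolerant adapter for transient connection errors.
    import requests
    from requests.adapters import HTTPAdapter
    from urllib3.util import Retry

    session = requests.Session()
    default_like = HTTPAdapter(max_retries=Retry(total=0, read=False))  # what the log shows; kept for comparison
    tolerant = HTTPAdapter(max_retries=Retry(total=3, connect=3, backoff_factor=0.5))
    session.mount("https://", tolerant)

    # (connect, read) timeouts in seconds; the log shows both left unset (None).
    resp = session.get("https://api.vercel.com/v5/domains/example.com", timeout=(5, 30))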
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ VercelProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/vercel.py:46: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/vercel.py:179: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v5/domains/fullcr1stal.tk', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ VercelProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/vercel.py:46: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/vercel.py:179: in _request ??? 
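The repeated function body also documents the zero-read-timeout guard: before conn.getresponse() is called, a resolved read timeout of 0 raises ReadTimeoutError outright, because http.client would otherwise surface it as a misleading BadStatusLine. A small illustrative call, not taken from this log, trips exactly that branch; retries=False makes the original exception propagate, as the docstring above states.

    import urllib3
    from urllib3.util import Timeout

    http = urllib3.PoolManager()
    try:
        # read=0 is rejected by the guard in _make_request before any response is read.
        http.request(
            "GET",
            "https://example.com/",
            timeout=Timeout(connect=5, read=0),
            retries=False,
        )
    except urllib3.exceptions.ReadTimeoutError as exc:
        print("rejected before reading the response:", exc)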
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v5/domains/fullcr1stal.tk', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ VercelProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? 
tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/vercel.py:46: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/vercel.py:179: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v5/domains/fullcr1stal.tk', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError ________________ VultrProviderTests.test_provider_authenticate _________________ self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/vultr.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/vultr.py:180: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/domains', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. 
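The upper frames are identical in every failure and show where the error enters lexicon: Provider.authenticate() issues its first GET through the _get()/_request() helpers, which delegate to requests, so nothing provider-specific ever runs; the Vultr failures beginning here hit the same line for GET /v2/domains, confirming the breakage is independent of the Vercel provider. Below is a hypothetical minimal reproducer under the same stack with lexicon taken out of the picture; the cassette path is made up and would need to contain a matching recorded interaction.

    # Hypothetical reproducer (not from the log): replaying any matching cassette
    # through vcrpy reaches the same log.debug() call in urllib3 >= 2.3.
    import requests
    import vcr

    with vcr.use_cassette("tests/fixtures/cassettes/any_recorded_call.yaml", record_mode="none"):
        requests.get(
            "https://api.vercel.com/v5/domains/example.com",
            headers={"Authorization": "Bearer placeholder_auth_token"},
        )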
If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ VultrProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:126: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/vultr.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/vultr.py:180: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/domains', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
_ VultrProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

self =
>   ???
tests/providers/integration_tests.py:126:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    [... same call chain and urllib3 _make_request frame as in test_provider_authenticate above ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ VultrProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

self =
>   ???
tests/providers/integration_tests.py:133:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    [... same call chain and urllib3 _make_request frame as in test_provider_authenticate above ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ VultrProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

self =
>   ???
tests/providers/integration_tests.py:156:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    [... same call chain and urllib3 _make_request frame as in test_provider_authenticate above ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ VultrProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

self =
>   ???
tests/providers/integration_tests.py:147:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    [... same call chain and urllib3 _make_request frame as in test_provider_authenticate above ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ VultrProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _

self =
>   ???
tests/providers/integration_tests.py:140:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    [... same call chain and urllib3 _make_request frame as in test_provider_authenticate above ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ VultrProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _

self =
>   ???
tests/providers/integration_tests.py:489:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    [... same call chain and urllib3 _make_request frame as in test_provider_authenticate above ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ VultrProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _

self =
>   ???
tests/providers/integration_tests.py:475:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    [... same call chain and urllib3 _make_request frame as in test_provider_authenticate above ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
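The remaining Vultr failures below are the same playback incompatibility, not provider-specific regressions. Until the test environment carries a vcrpy that provides this attribute on its stub response, a local shim along the following lines would let the cassettes play back. This is a hypothetical sketch, not dns-lexicon's or vcrpy's own code; it assumes vcrpy still exposes VCRHTTPResponse from vcr.stubs and that the only missing piece is the attribute named in the traceback.

    # conftest.py -- hypothetical local shim, not part of dns-lexicon or vcrpy.
    # Newer urllib3 interpolates response.version_string into a debug log line;
    # give vcrpy's stub response a fixed value so playback does not abort.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        VCRHTTPResponse.version_string = "HTTP/1.1"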
_ VultrProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _

self =
>   ???
tests/providers/integration_tests.py:303:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    [... same call chain and urllib3 _make_request frame as in test_provider_authenticate above ...]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ VultrProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _

self =
>   ???
tests/providers/integration_tests.py:327:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    [... same call chain and urllib3 _make_request frame as in test_provider_authenticate above ...]
        # Set properties that are used by the pooling layer.
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ VultrProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/vultr.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/vultr.py:180: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/domains', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. 
Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ VultrProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/vultr.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/vultr.py:180: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/domains', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ VultrProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/vultr.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/vultr.py:180: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/domains', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ VultrProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/vultr.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/vultr.py:180: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/domains', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ VultrProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ self = > ??? tests/providers/integration_tests.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/vultr.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/vultr.py:180: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/domains', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ VultrProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? tests/providers/integration_tests.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/vultr.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/vultr.py:180: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/domains', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. 
If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ VultrProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/vultr.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/vultr.py:180: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/domains', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ VultrProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/vultr.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/vultr.py:180: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/domains', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ VultrProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ self = > ??? tests/providers/integration_tests.py:501: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/vultr.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/vultr.py:180: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/domains', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ VultrProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/vultr.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/vultr.py:180: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/domains', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
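For reference, the retries, timeout, preload_content and decode_content parameters documented in the docstring repeated throughout these tracebacks are the same knobs exposed by urllib3's public API. The following is an illustration only, not taken from the test suite, and the URL is a placeholder:

    # Illustration: the parameters described above, exercised via the public
    # PoolManager rather than the internal _make_request().
    import urllib3
    from urllib3.util import Retry, Timeout

    http = urllib3.PoolManager()
    resp = http.request(
        "GET",
        "https://example.com/",                      # placeholder URL
        retries=Retry(total=2, backoff_factor=0.5),  # Retry object for fine-grained control
        timeout=Timeout(connect=2.0, read=5.0),      # per-request timeout override
        preload_content=False,                       # stream the body instead of buffering it
        decode_content=True,                         # decode per the content-encoding header
    )
    print(resp.status)
    data = resp.read()
    resp.release_conn()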
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ VultrProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/vultr.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/vultr.py:180: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/domains', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. 
If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ VultrProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/vultr.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/vultr.py:180: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/domains', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. 
Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ VultrProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/vultr.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/vultr.py:180: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/domains', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. 
:param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ VultrProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/vultr.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/vultr.py:180: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/domains', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ VultrProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/vultr.py:33: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/vultr.py:180: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/v2/domains', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Authorization': 'Bearer placeholder_auth_token'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
        Body returned by server must match value of Content-Length header, if present. Otherwise, raise error.
        """
        [... body of urllib3's _make_request elided; unchanged library code, failing on the final log.debug call shown below ...]

        # Set properties that are used by the pooling layer.
        response.retries = retries
        response._connection = response_conn  # type: ignore[attr-defined]
        response._pool = self  # type: ignore[attr-defined]

        log.debug(
            '%s://%s:%s "%s %s %s" %s %s',
            self.scheme,
            self.host,
            self.port,
            method,
            url,
>           response.version_string,
            response.status,
            response.length_remaining,
        )
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
________________ WebgoProviderTests.test_provider_authenticate _________________

self = >
    ???
tests/providers/integration_tests.py:106:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/webgo.py:44: in authenticate
    ???
/usr/lib/python3.13/site-packages/requests/sessions.py:602: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[request locals and the urllib3 _make_request listing are identical to the failure above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
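Every Webgo failure in this run is the same incompatibility: the urllib3 in the build chroot logs response.version_string from connectionpool.py:551, but the VCRHTTPResponse stub that vcrpy plays back in place of a live HTTP response does not define that attribute. A minimal conftest.py shim along the following lines could paper over the gap; this is a hypothetical sketch (vcr.stubs and VCRHTTPResponse are real vcrpy names, the shim itself is not part of dns-lexicon) and assumes the installed vcrpy still lacks the attribute.

    # conftest.py -- hypothetical compatibility shim, not shipped with dns-lexicon.
    # Assumes the installed vcrpy does not yet define version_string on its stub.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):

        @property
        def _version_string(self):
            # urllib3 only reads this attribute for its debug log line, so a
            # best-effort mapping from the numeric HTTP version is enough.
            return {9: "HTTP/0.9", 10: "HTTP/1.0", 11: "HTTP/1.1"}.get(
                getattr(self, "version", 11), "HTTP/?"
            )

        VCRHTTPResponse.version_string = _version_string

A vcrpy release that adds version_string to its response stub natively would make such a shim unnecessary.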
_ WebgoProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

tests/providers/integration_tests.py:126:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/webgo.py:44: in authenticate
[same requests/urllib3 chain as test_provider_authenticate above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ WebgoProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

tests/providers/integration_tests.py:133:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/webgo.py:44: in authenticate
[same requests/urllib3 chain as test_provider_authenticate above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ WebgoProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

tests/providers/integration_tests.py:156:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/webgo.py:44: in authenticate
[same requests/urllib3 chain as test_provider_authenticate above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ WebgoProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

tests/providers/integration_tests.py:147:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/webgo.py:44: in authenticate
[same requests/urllib3 chain as test_provider_authenticate above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ WebgoProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _

tests/providers/integration_tests.py:140:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/webgo.py:44: in authenticate
[same requests/urllib3 chain as test_provider_authenticate above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ WebgoProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _

tests/providers/integration_tests.py:489:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/webgo.py:44: in authenticate
[same requests/urllib3 chain as test_provider_authenticate above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ WebgoProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _

tests/providers/integration_tests.py:475:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/webgo.py:44: in authenticate
[same requests/urllib3 chain as test_provider_authenticate above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ WebgoProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _

tests/providers/integration_tests.py:303:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/webgo.py:44: in authenticate
[same requests/urllib3 chain as test_provider_authenticate above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ WebgoProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _

tests/providers/integration_tests.py:327:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/webgo.py:44: in authenticate
[same requests/urllib3 chain as test_provider_authenticate above]
E       AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WebgoProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/webgo.py:44: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WebgoProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/webgo.py:44: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WebgoProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/webgo.py:44: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WebgoProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/webgo.py:44: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WebgoProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ self = > ??? tests/providers/integration_tests.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/webgo.py:44: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WebgoProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? tests/providers/integration_tests.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/webgo.py:44: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WebgoProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/webgo.py:44: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WebgoProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/webgo.py:44: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WebgoProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ self = > ??? tests/providers/integration_tests.py:501: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/webgo.py:44: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WebgoProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/webgo.py:44: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WebgoProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/webgo.py:44: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WebgoProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/webgo.py:44: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WebgoProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/webgo.py:44: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WebgoProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/webgo.py:44: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WebgoProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/webgo.py:44: in authenticate ??? /usr/lib/python3.13/site-packages/requests/sessions.py:602: in get return self.request("GET", url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError ________________ WedosProviderTests.test_provider_authenticate _________________ self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
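[editor's note] Every Webgo failure above, and the Wedos failures that follow, are the same incompatibility rather than a provider bug: the urllib3 shipped here (2.3.x at the time of this January 2025 build) logs response.version_string from connectionpool.py, but the VCRHTTPResponse object that vcrpy substitutes during cassette playback does not define that attribute, so every authenticate() call dies before the provider code is exercised. The durable fix is a vcrpy release that understands urllib3 2.3; as a stopgap, a test-only shim can supply the missing property. The sketch below is a hypothetical conftest.py patch, assuming vcrpy exposes VCRHTTPResponse under vcr.stubs; it is not part of dns-lexicon, vcrpy, or this PKGBUILD.

# conftest.py -- hypothetical, test-only workaround; prefer upgrading vcrpy to a
# release with urllib3 2.3 support once one is packaged.
try:
    from vcr.stubs import VCRHTTPResponse
except ImportError:          # vcrpy absent: nothing to patch
    VCRHTTPResponse = None

if VCRHTTPResponse is not None and not hasattr(VCRHTTPResponse, "version_string"):

    @property
    def _version_string(self):
        # urllib3 only uses this value for its debug log line, so a best-effort
        # mapping from the http.client-style integer version is enough here.
        return {9: "HTTP/0.9", 10: "HTTP/1.0", 11: "HTTP/1.1"}.get(
            getattr(self, "version", None), "HTTP/?"
        )

    VCRHTTPResponse.version_string = _version_string  # type: ignore[attr-defined]

With such a shim (or a fixed vcrpy) in the chroot, the recorded cassettes play back normally and the remaining traceback noise below disappears.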
src/lexicon/_private/providers/wedos.py:59: in authenticate
    ???
src/lexicon/interfaces.py:171: in _post
    return self._request("POST", url, data=data, query_params=query_params)
src/lexicon/_private/providers/wedos.py:226: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

method = 'POST', url = '/wapi/json'
body = 'request=%7B%22request%22%3A+%7B%22user%22%3A+%22placeholder_auth_username%22%2C+%22auth%22%3A+%2259803c19535672f6ade1c6fa8ecd263b18b9295c%22%2C+%22command%22%3A+%22dns-domains-list%22%2C+%22data%22%3A+%22%22%7D%7D'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '213', 'Content-Type': 'application/x-www-form-urlencoded'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
preload_content = False, decode_content = False, enforce_content_length = True

    def _make_request(
        self,
        conn: BaseHTTPConnection,
        method: str,
        url: str,
        body: _TYPE_BODY | None = None,
        headers: typing.Mapping[str, str] | None = None,
        retries: Retry | None = None,
        timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
        chunked: bool = False,
        response_conn: BaseHTTPConnection | None = None,
        preload_content: bool = True,
        decode_content: bool = True,
        enforce_content_length: bool = True,
    ) -> BaseHTTPResponse:
        (docstring and body of urllib3's _make_request elided; the recorded response is returned normally and the method only fails at the debug-logging call below)
        ...
            log.debug(
                '%s://%s:%s "%s %s %s" %s %s',
                self.scheme,
                self.host,
                self.port,
                method,
                url,
>               response.version_string,
                response.status,
                response.length_remaining,
            )
E           AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
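Every Wedos failure below is the same crash: the provider tests replay recorded HTTP responses through vcrpy, and the urllib3 in this chroot reads response.version_string when it logs the completed request (the attribute was introduced in the urllib3 2.3 series), which vcrpy's VCRHTTPResponse stub does not define. Newer vcrpy releases add the attribute themselves; until the chroot picks one up, a minimal local shim along these lines would likely unblock the suite (a sketch, assuming vcrpy still exposes the stub as vcr.stubs.VCRHTTPResponse and that a fixed protocol string is acceptable for the log line):

    # conftest.py (hypothetical shim, not part of this build)
    from vcr.stubs import VCRHTTPResponse

    # urllib3 >= 2.3 only reads response.version_string for its debug
    # log line, so a constant placeholder is enough for replayed responses.
    if not hasattr(VCRHTTPResponse, "version_string"):
        VCRHTTPResponse.version_string = "HTTP/1.1"

The other obvious stopgap is to deselect the affected provider suites when running pytest (for example with -k "not Wedos") until the incompatibility is resolved upstream.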
Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WedosProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:133: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/wedos.py:59: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/wedos.py:226: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/wapi/json' body = 'request=%7B%22request%22%3A+%7B%22user%22%3A+%22placeholder_auth_username%22%2C+%22auth%22%3A+%2259803c19535672f6ade1c6fa8ecd263b18b9295c%22%2C+%22command%22%3A+%22dns-domains-list%22%2C+%22data%22%3A+%22%22%7D%7D' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '213', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WedosProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = > ??? tests/providers/integration_tests.py:156: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/wedos.py:59: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/wedos.py:226: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/wapi/json' body = 'request=%7B%22request%22%3A+%7B%22user%22%3A+%22placeholder_auth_username%22%2C+%22auth%22%3A+%2259803c19535672f6ade1c6fa8ecd263b18b9295c%22%2C+%22command%22%3A+%22dns-domains-list%22%2C+%22data%22%3A+%22%22%7D%7D' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '213', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WedosProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = > ??? tests/providers/integration_tests.py:147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/wedos.py:59: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/wedos.py:226: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/wapi/json' body = 'request=%7B%22request%22%3A+%7B%22user%22%3A+%22placeholder_auth_username%22%2C+%22auth%22%3A+%2259803c19535672f6ade1c6fa8ecd263b18b9295c%22%2C+%22command%22%3A+%22dns-domains-list%22%2C+%22data%22%3A+%22%22%7D%7D' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '213', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. 
Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WedosProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _ self = > ??? tests/providers/integration_tests.py:140: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/wedos.py:59: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/wedos.py:226: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/wapi/json' body = 'request=%7B%22request%22%3A+%7B%22user%22%3A+%22placeholder_auth_username%22%2C+%22auth%22%3A+%2259803c19535672f6ade1c6fa8ecd263b18b9295c%22%2C+%22command%22%3A+%22dns-domains-list%22%2C+%22data%22%3A+%22%22%7D%7D' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '213', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WedosProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _ self = > ??? tests/providers/integration_tests.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/wedos.py:59: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/wedos.py:226: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/wapi/json' body = 'request=%7B%22request%22%3A+%7B%22user%22%3A+%22placeholder_auth_username%22%2C+%22auth%22%3A+%2259803c19535672f6ade1c6fa8ecd263b18b9295c%22%2C+%22command%22%3A+%22dns-domains-list%22%2C+%22data%22%3A+%22%22%7D%7D' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '213', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WedosProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _ self = > ??? tests/providers/integration_tests.py:475: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/wedos.py:59: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/wedos.py:226: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/wapi/json' body = 'request=%7B%22request%22%3A+%7B%22user%22%3A+%22placeholder_auth_username%22%2C+%22auth%22%3A+%2259803c19535672f6ade1c6fa8ecd263b18b9295c%22%2C+%22command%22%3A+%22dns-domains-list%22%2C+%22data%22%3A+%22%22%7D%7D' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '213', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. 
Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WedosProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/wedos.py:59: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/wedos.py:226: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/wapi/json' body = 'request=%7B%22request%22%3A+%7B%22user%22%3A+%22placeholder_auth_username%22%2C+%22auth%22%3A+%2259803c19535672f6ade1c6fa8ecd263b18b9295c%22%2C+%22command%22%3A+%22dns-domains-list%22%2C+%22data%22%3A+%22%22%7D%7D' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '213', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WedosProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/wedos.py:59: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/wedos.py:226: in _request ??? 
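Every Wedos failure in this run dies at the same spot: urllib3's connectionpool.py:551 logs response.version_string, an attribute added in urllib3 2.3, while the VCRHTTPResponse object that vcrpy substitutes when replaying the test cassettes predates it. Upgrading vcrpy (newer releases add the attribute) or pinning urllib3 below 2.3 should clear these tests; purely as an illustration, a local shim for the test run could look like the sketch below (the conftest.py placement and the static "HTTP/1.1" value are assumptions, not something this build does):

    # conftest.py: hypothetical local workaround, not part of the dns-lexicon package.
    # Give vcrpy's playback response the attribute that urllib3 >= 2.3 wants to log.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # The value only feeds urllib3's debug log line, so a static placeholder
        # is enough when responses come from recorded cassettes.
        VCRHTTPResponse.version_string = "HTTP/1.1"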
[... identical traceback through requests and urllib3 omitted ...] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ WedosProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ tests/providers/integration_tests.py:313: [... identical traceback omitted ...] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ WedosProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ tests/providers/integration_tests.py:294: [... identical traceback omitted ...] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ WedosProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ tests/providers/integration_tests.py:541: [... identical traceback omitted ...] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
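The failing frames all point at /usr/lib/python3.13/site-packages, i.e. the urllib3 and vcrpy installed in the build environment rather than anything vendored by dns-lexicon. A quick pairing check could look like the sketch below (distribution names as published on PyPI):

    # Sketch: report the library pairing the build environment is actually testing against.
    from importlib.metadata import version

    for dist in ("urllib3", "vcrpy", "requests"):
        print(dist, version(dist))
    # urllib3 >= 2.3 logs response.version_string; a vcrpy release that predates
    # that attribute reproduces the AttributeError seen in these Wedos tests.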
_ WedosProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ tests/providers/integration_tests.py:521: [... identical traceback omitted ...] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ WedosProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ tests/providers/integration_tests.py:211: [... identical traceback omitted ...] E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ WedosProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? tests/providers/integration_tests.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/wedos.py:59: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/wedos.py:226: in _request ???
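For reference, the url-encoded body local captured in these frames is the JSON envelope the wedos provider posts to /wapi/json while authenticating. Decoding it with the standard library (a sketch reusing the exact string from the log; the credentials are the placeholder values recorded in the cassette) shows the initial dns-domains-list command:

    # Sketch: decode the 'body' value shown in the locals of these frames.
    import json
    from urllib.parse import parse_qs

    body = (
        "request=%7B%22request%22%3A+%7B%22user%22%3A+%22placeholder_auth_username%22"
        "%2C+%22auth%22%3A+%2259803c19535672f6ade1c6fa8ecd263b18b9295c%22"
        "%2C+%22command%22%3A+%22dns-domains-list%22%2C+%22data%22%3A+%22%22%7D%7D"
    )
    payload = json.loads(parse_qs(body)["request"][0])
    print(payload["request"]["command"])  # dns-domains-list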
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/wapi/json' body = 'request=%7B%22request%22%3A+%7B%22user%22%3A+%22placeholder_auth_username%22%2C+%22auth%22%3A+%2259803c19535672f6ade1c6fa8ecd263b18b9295c%22%2C+%22command%22%3A+%22dns-domains-list%22%2C+%22data%22%3A+%22%22%7D%7D' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '213', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WedosProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/wedos.py:59: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/wedos.py:226: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/wapi/json' body = 'request=%7B%22request%22%3A+%7B%22user%22%3A+%22placeholder_auth_username%22%2C+%22auth%22%3A+%2259803c19535672f6ade1c6fa8ecd263b18b9295c%22%2C+%22command%22%3A+%22dns-domains-list%22%2C+%22data%22%3A+%22%22%7D%7D' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '213', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. 
Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WedosProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/wedos.py:59: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/wedos.py:226: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/wapi/json' body = 'request=%7B%22request%22%3A+%7B%22user%22%3A+%22placeholder_auth_username%22%2C+%22auth%22%3A+%2259803c19535672f6ade1c6fa8ecd263b18b9295c%22%2C+%22command%22%3A+%22dns-domains-list%22%2C+%22data%22%3A+%22%22%7D%7D' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '213', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) 
:param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WedosProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ self = > ??? tests/providers/integration_tests.py:501: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/wedos.py:59: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/wedos.py:226: in _request ??? 
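
Every Wedos failure in this run is the same incompatibility: the urllib3 in the chroot is new enough that _make_request() logs response.version_string, while the VCRHTTPResponse that the test suite replays from its recorded cassettes (presumably vcrpy's stub class, as the AttributeError names it) does not define that attribute. A quick check along these lines could be run inside the chroot to confirm the mismatch; the vcr.stubs import path is an assumption based on where vcrpy normally keeps VCRHTTPResponse.

# Sketch of a diagnosis snippet, assuming vcrpy provides the VCRHTTPResponse
# named in the AttributeError above.
import urllib3
from urllib3.response import HTTPResponse
from vcr.stubs import VCRHTTPResponse

print(urllib3.__version__)
# urllib3's own response type carries the attribute its connection pool logs...
print(hasattr(HTTPResponse, "version_string"))
# ...but the replayed cassette response class does not, which is exactly the
# failure reported for each Wedos test in this log.
print(hasattr(VCRHTTPResponse, "version_string"))
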
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/wapi/json' body = 'request=%7B%22request%22%3A+%7B%22user%22%3A+%22placeholder_auth_username%22%2C+%22auth%22%3A+%2259803c19535672f6ade1c6fa8ecd263b18b9295c%22%2C+%22command%22%3A+%22dns-domains-list%22%2C+%22data%22%3A+%22%22%7D%7D' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '213', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WedosProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/wedos.py:59: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/wedos.py:226: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/wapi/json' body = 'request=%7B%22request%22%3A+%7B%22user%22%3A+%22placeholder_auth_username%22%2C+%22auth%22%3A+%2259803c19535672f6ade1c6fa8ecd263b18b9295c%22%2C+%22command%22%3A+%22dns-domains-list%22%2C+%22data%22%3A+%22%22%7D%7D' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '213', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. 
Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WedosProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = > ??? tests/providers/integration_tests.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/wedos.py:59: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/wedos.py:226: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/wapi/json' body = 'request=%7B%22request%22%3A+%7B%22user%22%3A+%22placeholder_auth_username%22%2C+%22auth%22%3A+%2259803c19535672f6ade1c6fa8ecd263b18b9295c%22%2C+%22command%22%3A+%22dns-domains-list%22%2C+%22data%22%3A+%22%22%7D%7D' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '213', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WedosProviderTests.test_provider_when_calling_update_record_should_modify_record _ self = > ??? tests/providers/integration_tests.py:240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/wedos.py:59: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/wedos.py:226: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/wapi/json' body = 'request=%7B%22request%22%3A+%7B%22user%22%3A+%22placeholder_auth_username%22%2C+%22auth%22%3A+%2259803c19535672f6ade1c6fa8ecd263b18b9295c%22%2C+%22command%22%3A+%22dns-domains-list%22%2C+%22data%22%3A+%22%22%7D%7D' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '213', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WedosProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/wedos.py:59: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/wedos.py:226: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/wapi/json' body = 'request=%7B%22request%22%3A+%7B%22user%22%3A+%22placeholder_auth_username%22%2C+%22auth%22%3A+%2259803c19535672f6ade1c6fa8ecd263b18b9295c%22%2C+%22command%22%3A+%22dns-domains-list%22%2C+%22data%22%3A+%22%22%7D%7D' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '213', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. 
Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. 
Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WedosProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/wedos.py:59: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/wedos.py:226: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'POST', url = '/wapi/json' body = 'request=%7B%22request%22%3A+%7B%22user%22%3A+%22placeholder_auth_username%22%2C+%22auth%22%3A+%2259803c19535672f6ade1c6fa8ecd263b18b9295c%22%2C+%22command%22%3A+%22dns-domains-list%22%2C+%22data%22%3A+%22%22%7D%7D' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '213', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. 
:param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ WedosProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/wedos.py:59: in authenticate ??? src/lexicon/interfaces.py:171: in _post return self._request("POST", url, data=data, query_params=query_params) src/lexicon/_private/providers/wedos.py:226: in _request ??? 
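
Until the packaged vcrpy gains the attribute itself, one possible local workaround is to patch version_string onto VCRHTTPResponse before the tests run. This is a hypothetical conftest.py-level shim, not anything shipped by lexicon, vcrpy, or urllib3; the integer-to-string mapping simply mimics the "HTTP/1.1"-style values urllib3 prints in its debug log.

# Hypothetical shim (e.g. in the test suite's conftest.py): give replayed
# cassette responses the version_string attribute that newer urllib3 logs.
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):

    @property
    def _version_string(self):
        # Map http.client-style integer versions to a protocol string.
        return {9: "HTTP/0.9", 10: "HTTP/1.0", 11: "HTTP/1.1"}.get(
            getattr(self, "version", 11), "HTTP/?"
        )

    VCRHTTPResponse.version_string = _version_string

The hasattr guard keeps the shim harmless: once an upstream vcrpy (or a lexicon test fixture) provides the attribute, the patch becomes a no-op.
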
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = conn = method = 'POST', url = '/wapi/json'
body = 'request=%7B%22request%22%3A+%7B%22user%22%3A+%22placeholder_auth_username%22%2C+%22auth%22%3A+%2259803c19535672f6ade1c6fa8ecd263b18b9295c%22%2C+%22command%22%3A+%22dns-domains-list%22%2C+%22data%22%3A+%22%22%7D%7D'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '213', 'Content-Type': 'application/x-www-form-urlencoded'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn = preload_content = False, decode_content = False, enforce_content_length = True
> response.version_string, response.status, response.length_remaining, )
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
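All of the failures above and below collapse to the same AttributeError: at connectionpool.py:551 urllib3 logs response.version_string for every completed request, and the VCRHTTPResponse object that the cassette-backed provider tests substitute for a real socket response never defines that attribute. A minimal workaround sketch, assuming the stub is vcrpy's vcr.stubs.VCRHTTPResponse and that nothing other than this debug log line reads the attribute; it is an illustration, not the fix shipped with the package:

# conftest.py (hypothetical shim, not part of dns-lexicon)
from vcr.stubs import VCRHTTPResponse

# Recent urllib3 releases interpolate response.version_string into a debug log
# line for every request they handle; the replay stub predates the attribute,
# so a static placeholder is enough to keep replayed responses from crashing.
if not hasattr(VCRHTTPResponse, "version_string"):
    VCRHTTPResponse.version_string = "HTTP/1.1"

Upgrading to a vcrpy release that already provides version_string would make such a shim unnecessary.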
______________ YandexPDDProviderTests.test_provider_authenticate _______________
self = > ??? tests/providers/integration_tests.py:106:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/yandex.py:43: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/yandex.py:181: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = conn = method = 'GET', url = '/api2/admin/dns/list?domain=example.com', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'PddToken': 'placeholder_auth_token', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn = preload_content = False, decode_content = False, enforce_content_length = True
> response.version_string, response.status, response.length_remaining, )
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ YandexPDDProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
self = > ??? tests/providers/integration_tests.py:126:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/yandex.py:43: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/yandex.py:181: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ YandexPDDProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
self = > ??? tests/providers/integration_tests.py:133:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/yandex.py:43: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/yandex.py:181: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ YandexPDDProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
self = > ??? tests/providers/integration_tests.py:156:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/yandex.py:43: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/yandex.py:181: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ YandexPDDProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _
self = > ??? tests/providers/integration_tests.py:147:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/yandex.py:43: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/yandex.py:181: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ YandexPDDProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _
self = > ??? tests/providers/integration_tests.py:140:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/yandex.py:43: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/yandex.py:181: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
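If patching the stub inside the build chroot is not an option, the same hasattr check can instead skip the affected provider tests rather than failing the whole suite. A sketch using a standard pytest collection hook; the tests/providers/ path filter matches the traceback paths above, but treat it as an assumption about how the suite is laid out:

# conftest.py (hypothetical) - skip cassette-driven provider tests while the
# recorded-response stub lacks the attribute urllib3 wants to log.
import pytest

def pytest_collection_modifyitems(config, items):
    try:
        from vcr.stubs import VCRHTTPResponse
    except ImportError:
        return  # no vcrpy installed, nothing to work around
    if hasattr(VCRHTTPResponse, "version_string"):
        return  # stub already compatible with the installed urllib3
    skip = pytest.mark.skip(reason="VCRHTTPResponse lacks version_string")
    for item in items:
        if "tests/providers/" in item.nodeid:
            item.add_marker(skip)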
_ YandexPDDProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _
self = > ??? tests/providers/integration_tests.py:489:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/yandex.py:43: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/yandex.py:181: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ YandexPDDProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _
self = > ??? tests/providers/integration_tests.py:475:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/yandex.py:43: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/yandex.py:181: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = conn = method = 'GET', url = '/api2/admin/dns/list?domain=example.com', body = '{}'
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'PddToken': 'placeholder_auth_token', 'Content-Length': '2'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn = preload_content = False, decode_content = False, enforce_content_length = True
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ YandexPDDProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = > ??? tests/providers/integration_tests.py:303: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/yandex.py:43: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/yandex.py:181: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api2/admin/dns/list?domain=example.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'PddToken': 'placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ YandexPDDProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/yandex.py:43: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/yandex.py:181: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api2/admin/dns/list?domain=example.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'PddToken': 'placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ YandexPDDProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = > ??? tests/providers/integration_tests.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/yandex.py:43: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/yandex.py:181: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api2/admin/dns/list?domain=example.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'PddToken': 'placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ YandexPDDProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = > ??? 
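Because the identical traceback repeats for every provider test, a quick check of what is actually installed in the build chroot narrows things down faster than reading each failure. A throwaway snippet along these lines (run inside the build environment; the distribution names queried here are assumptions on my part) reports the relevant versions and whether the playback stub already carries the new attribute:

from importlib.metadata import version

print("urllib3:", version("urllib3"))  # version_string is a recent urllib3 addition (listed in its 2.3.0 changelog)
print("vcrpy:", version("vcrpy"))

from vcr.stubs import VCRHTTPResponse
print("stub has version_string:", hasattr(VCRHTTPResponse, "version_string"))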
tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/yandex.py:43: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/yandex.py:181: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api2/admin/dns/list?domain=example.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'PddToken': 'placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ YandexPDDProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = > ??? tests/providers/integration_tests.py:541: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/yandex.py:43: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/yandex.py:181: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api2/admin/dns/list?domain=example.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'PddToken': 'placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. 
Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. 
if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ YandexPDDProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/yandex.py:43: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/yandex.py:181: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api2/admin/dns/list?domain=example.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'PddToken': 'placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. 
If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ YandexPDDProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ self = > ??? tests/providers/integration_tests.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/yandex.py:43: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/yandex.py:181: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/api2/admin/dns/list?domain=example.com', body = '{}' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'PddToken': 'placeholder_auth_token', 'Content-Length': '2'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. 
:param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ YandexPDDProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = > ??? 
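If patching the stub is undesirable, another hedged option is to make the incompatibility explicit so the rest of the suite still passes: detect the missing attribute once and skip the cassette-backed provider tests with a marker. The marker name and where it would be applied are hypothetical; nothing in the lexicon test suite is implied to already work this way.

import pytest
from vcr.stubs import VCRHTTPResponse

requires_compatible_vcr = pytest.mark.skipif(
    not hasattr(VCRHTTPResponse, "version_string"),
    reason="vcrpy's VCRHTTPResponse lacks version_string required by urllib3 >= 2.3",
)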
_ YandexPDDProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _
tests/providers/integration_tests.py:507:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/yandex.py:43: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/yandex.py:181: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
_ YandexPDDProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _
tests/providers/integration_tests.py:199:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/yandex.py:43: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/yandex.py:181: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
_ YandexPDDProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _
tests/providers/integration_tests.py:185:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/yandex.py:43: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/yandex.py:181: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
_ YandexPDDProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _
tests/providers/integration_tests.py:501:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/yandex.py:43: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/yandex.py:181: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
_ YandexPDDProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _
tests/providers/integration_tests.py:173:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/yandex.py:43: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/yandex.py:181: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
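
The remaining list_records and update_record failures repeat this traceback verbatim, so checking the chroot's installed vcrpy directly settles the diagnosis. A short check, run with the same Python interpreter used by the test suite (sketch):

    # Prints False when the installed vcrpy stub lacks the attribute urllib3 logs,
    # which is exactly the condition behind each AttributeError in this test run.
    from vcr.stubs import VCRHTTPResponse
    print(hasattr(VCRHTTPResponse, "version_string"))
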
_ YandexPDDProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _
tests/providers/integration_tests.py:166:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/yandex.py:43: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/yandex.py:181: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
_ YandexPDDProviderTests.test_provider_when_calling_update_record_should_modify_record _
tests/providers/integration_tests.py:240:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/yandex.py:43: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/yandex.py:181: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
_ YandexPDDProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _
tests/providers/integration_tests.py:251:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/yandex.py:43: in authenticate
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/yandex.py:181: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: in _make_request
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
_ YandexPDDProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _
tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/yandex.py:43: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/yandex.py:181: in _request ???
[requests/urllib3 frames, request locals, and the urllib3 _make_request source are identical to the first failure above; elided]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ YandexPDDProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _
self = > ???
tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/yandex.py:43: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/yandex.py:181: in _request ???
[requests/urllib3 frames, request locals, and the urllib3 _make_request source are identical to the first failure above; elided]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
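Every failure in this run is the same error: urllib3's connection pool (connectionpool.py:551) reads response.version_string when it logs the completed request, but during these tests the response object is vcrpy's VCRHTTPResponse stub, which does not define that attribute. That points to a mismatch between the vcrpy stub and the urllib3 version installed in the chroot rather than anything riscv64-specific. Until a vcrpy release that provides the attribute is packaged, one possible local workaround is to give the stub a best-effort value from the test suite's conftest.py. This is only a sketch under assumptions: the import path vcr.stubs for VCRHTTPResponse is taken from current vcrpy releases, and "HTTP/1.1" is merely a plausible default for replayed cassettes, not a value recorded in them.

    # conftest.py (workaround sketch, not the upstream fix)
    from vcr.stubs import VCRHTTPResponse

    # urllib3's request logging accesses response.version_string; the vcrpy
    # stub predates that attribute, so give it a harmless default.
    if not hasattr(VCRHTTPResponse, "version_string"):
        VCRHTTPResponse.version_string = "HTTP/1.1"  # assumed default for replayed responses

With the attribute present, the log.debug call in _make_request no longer raises and the replayed responses reach the provider tests as before.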
_____________ YandexCloudProviderTests.test_provider_authenticate ______________
self = > ???
tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/yandexcloud.py:81: in authenticate ??? src/lexicon/_private/providers/yandexcloud.py:95: in _cloud_id_matches_domain ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/yandexcloud.py:362: in _request ???
[requests/urllib3 frames and the urllib3 _make_request source are identical to the failures above; the replayed request here is GET /dns/v1/zones/dns3a9nospukjt4jlqdm with an 'Authorization': 'Bearer placeholder_auth_token' header; elided]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ YandexCloudProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
self = > ???
tests/providers/integration_tests.py:126: [same yandexcloud authentication chain and elided urllib3 frame as above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ YandexCloudProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
self = > ???
tests/providers/integration_tests.py:133: [same yandexcloud authentication chain and elided urllib3 frame as above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ YandexCloudProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
self = > ???
tests/providers/integration_tests.py:156: [same yandexcloud authentication chain and elided urllib3 frame as above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ YandexCloudProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _
self = > ???
tests/providers/integration_tests.py:147: [same yandexcloud authentication chain and elided urllib3 frame as above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
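If patching the stub is not desirable for the packaged build, the same failures can be sidestepped by excluding the affected provider suites when the tests are run. This is a workaround sketch using pytest's standard -k name filter; the substring "Yandex" is chosen only because it matches the two failing test classes shown in this log:

    python -m pytest tests/providers -k "not Yandex"

Either approach only masks the incompatibility; the underlying mismatch remains until the stub response provides version_string or the installed urllib3 stops logging it.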
_ YandexCloudProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _
self = > ???
tests/providers/integration_tests.py:140: [same yandexcloud authentication chain and elided urllib3 frame as above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ YandexCloudProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _
self = > ???
tests/providers/integration_tests.py:489:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/yandexcloud.py:81: in authenticate ???
src/lexicon/_private/providers/yandexcloud.py:95: in _cloud_id_matches_domain ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/yandexcloud.py:362: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ YandexCloudProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _
self = > ???
tests/providers/integration_tests.py:475:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/yandexcloud.py:81: in authenticate ???
src/lexicon/_private/providers/yandexcloud.py:95: in _cloud_id_matches_domain ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/yandexcloud.py:362: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
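Every YandexCloudProviderTests failure in this run reduces to the same incompatibility: the urllib3 in the chroot logs response.version_string at connectionpool.py:551 (the frame shown above), but the VCRHTTPResponse object that vcrpy substitutes while replaying a cassette never defines that attribute. Until the build root picks up a vcrpy release that adds it, a test-only shim along the following lines could go into the suite's conftest.py. This is a sketch, not the upstream fix; the module path vcr.stubs and the integer version attribute it falls back on are assumptions about vcrpy's layout, not details taken from this log.

# conftest.py -- compatibility shim sketch, assuming vcrpy exposes the
# replayed-response class as vcr.stubs.VCRHTTPResponse and mirrors
# http.client's integer `version` attribute (10/11).
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):

    @property
    def version_string(self):
        # urllib3 2.3+ reads this attribute when logging the request
        # line; translate the integer version back to its wire form.
        return {9: "HTTP/0.9", 10: "HTTP/1.0", 11: "HTTP/1.1"}.get(
            getattr(self, "version", 11), "HTTP/?"
        )

    VCRHTTPResponse.version_string = version_string

With the attribute present, the cassette-backed requests should get past the log.debug call shown in the traceback above instead of raising AttributeError.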
_ YandexCloudProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _
self = > ???
tests/providers/integration_tests.py:303:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/yandexcloud.py:81: in authenticate ???
src/lexicon/_private/providers/yandexcloud.py:95: in _cloud_id_matches_domain ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/yandexcloud.py:362: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ YandexCloudProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _
self = > ???
tests/providers/integration_tests.py:327:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/yandexcloud.py:81: in authenticate ???
src/lexicon/_private/providers/yandexcloud.py:95: in _cloud_id_matches_domain ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/yandexcloud.py:362: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ YandexCloudProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _
self = > ???
tests/providers/integration_tests.py:313:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/yandexcloud.py:81: in authenticate ???
src/lexicon/_private/providers/yandexcloud.py:95: in _cloud_id_matches_domain ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/yandexcloud.py:362: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ YandexCloudProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _
self = > ???
tests/providers/integration_tests.py:294:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/yandexcloud.py:81: in authenticate ???
src/lexicon/_private/providers/yandexcloud.py:95: in _cloud_id_matches_domain ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/yandexcloud.py:362: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
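If patching a third-party class is undesirable, the same diagnosis supports a narrower stopgap: detect the missing attribute once at collection time and skip the affected cassette-backed tests. The hook below is a sketch keyed on the test class name visible in these tracebacks; treating YandexCloudProviderTests as the only affected suite is an assumption, since any other provider test replaying a cassette through the same urllib3 path would fail identically.

# conftest.py -- collection-time stopgap sketch (assumption: only the
# cassette-backed suite named in this log needs to be skipped).
import pytest

try:
    from vcr.stubs import VCRHTTPResponse
    _NEEDS_SKIP = not hasattr(VCRHTTPResponse, "version_string")
except ImportError:  # vcrpy absent: nothing to work around
    _NEEDS_SKIP = False


def pytest_collection_modifyitems(config, items):
    if not _NEEDS_SKIP:
        return
    skip = pytest.mark.skip(
        reason="vcrpy's VCRHTTPResponse lacks version_string (urllib3 2.3+)"
    )
    for item in items:
        # the nodeid of a class-based test contains the class name
        if "YandexCloudProviderTests" in item.nodeid:
            item.add_marker(skip)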
_ YandexCloudProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _
self = > ???
tests/providers/integration_tests.py:541:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/yandexcloud.py:81: in authenticate ???
src/lexicon/_private/providers/yandexcloud.py:95: in _cloud_id_matches_domain ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/yandexcloud.py:362: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ YandexCloudProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _
self = > ???
tests/providers/integration_tests.py:521:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/yandexcloud.py:81: in authenticate ???
src/lexicon/_private/providers/yandexcloud.py:95: in _cloud_id_matches_domain ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/yandexcloud.py:362: in _request ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request(
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ YandexCloudProviderTests.test_provider_when_calling_list_records_after_setting_ttl _
self = > ???
tests/providers/integration_tests.py:211: the call chain is identical to the failure above (integration_tests.py:418 in _construct_authenticated_provider -> yandexcloud.py:81 in authenticate -> yandexcloud.py:95 in _cloud_id_matches_domain -> interfaces.py:163 in _get -> yandexcloud.py:362 in _request -> requests -> urllib3 connectionpool._make_request) and ends with the same error:
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
The identical traceback and AttributeError are repeated verbatim for each of the following tests:
_ YandexCloudProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ (tests/providers/integration_tests.py:507)
_ YandexCloudProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ (tests/providers/integration_tests.py:199)
_ YandexCloudProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ (tests/providers/integration_tests.py:185)
_ YandexCloudProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _ (tests/providers/integration_tests.py:501)
_ YandexCloudProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ (tests/providers/integration_tests.py:173)
_ YandexCloudProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ (tests/providers/integration_tests.py:166)
_ YandexCloudProviderTests.test_provider_when_calling_update_record_should_modify_record _ (tests/providers/integration_tests.py:240)
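If the goal were just to get this rebuild past the incompatibility rather than wait for updated packages, one option is a small test-time shim that gives the stub the attribute urllib3 wants. The sketch below is a hypothetical conftest.py fragment, not something dns-lexicon ships; upgrading python-vcrpy to a release compatible with urllib3 2.3, or constraining urllib3 below 2.3 in the build chroot, would be the cleaner fix.

    # Hypothetical conftest.py shim (an assumption, not the package's actual fix):
    # urllib3 2.3 only reads version_string for the debug log line at
    # connectionpool.py:551, so a plain string on the stub class is enough to
    # keep _make_request from raising during cassette replay.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        VCRHTTPResponse.version_string = "HTTP/1.1"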
_ YandexCloudProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ???
tests/providers/integration_tests.py:251:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/yandexcloud.py:81: in authenticate ???
src/lexicon/_private/providers/yandexcloud.py:95: in _cloud_id_matches_domain ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/yandexcloud.py:362: in _request ???
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ YandexCloudProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _
tests/providers/integration_tests.py:275:
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ YandexCloudProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _
tests/providers/integration_tests.py:259:
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
________________ ZiloreProviderTests.test_provider_authenticate ________________
tests/providers/integration_tests.py:106:
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ZiloreProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
tests/providers/test_zilore.py:28:
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ZiloreProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
tests/providers/integration_tests.py:133:
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ZiloreProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
tests/providers/integration_tests.py:156:
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ZiloreProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _
tests/providers/integration_tests.py:147:
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
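Every failure above is the same incompatibility: at connectionpool.py:551 the urllib3 in the build chroot logs response.version_string for the completed request (an attribute urllib3 started logging in its 2.3 series), but the response object replayed by vcrpy for these offline provider tests is a VCRHTTPResponse, which does not define it. The durable fix is a python-vcrpy build that knows about version_string; for a chroot build like this one, a conftest-level shim can backfill the attribute. The sketch below is a hypothetical local workaround, not part of dns-lexicon or vcrpy; it assumes VCRHTTPResponse is importable from vcr.stubs and that a replayed cassette response may be reported as HTTP/1.1 when no version was recorded.

    # conftest.py -- hypothetical compatibility shim (assumptions noted above)
    from vcr.stubs import VCRHTTPResponse  # assumed import path for vcrpy's stub class

    if not hasattr(VCRHTTPResponse, "version_string"):

        @property
        def version_string(self) -> str:
            # Translate the numeric HTTP version, if the replayed stub carries one,
            # into the label urllib3 wants to log; default to HTTP/1.1 otherwise.
            return {9: "HTTP/0.9", 10: "HTTP/1.0", 11: "HTTP/1.1"}.get(
                getattr(self, "version", None), "HTTP/1.1"
            )

        VCRHTTPResponse.version_string = version_string  # type: ignore[attr-defined]

With the attribute present, whether via such a shim or an updated python-vcrpy package, the replay reaches the log.debug call without raising and the AttributeError above goes away.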
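As an aside on the locals shown in each traceback: retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) is the single-attempt policy requests hands down to urllib3, and the _make_request docstring repeated above lists the other accepted forms for that parameter. A small offline illustration of how those forms normalize (Retry.from_int is the helper urlopen itself uses; values here are illustrative only):

    from urllib3.util.retry import Retry

    # The single-attempt policy visible in the locals above.
    single_attempt = Retry(total=0, connect=None, read=False, redirect=None, status=None)
    print(single_attempt)

    # Shorthand forms described in the docstring, normalized to Retry objects:
    print(Retry.from_int(3))      # an int retries connection errors that many times
    print(Retry.from_int(0))      # zero never retries
    print(Retry.from_int(False))  # False disables retries; exceptions propagate immediately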
_ ZiloreProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _

???
tests/providers/integration_tests.py:140:
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ZiloreProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _

???
tests/providers/integration_tests.py:489:
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ZiloreProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _

???
tests/providers/integration_tests.py:475:
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ZiloreProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _

???
tests/providers/integration_tests.py:303:
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
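If patching the stub is not wanted, another stopgap is to collect the affected tests as expected failures so the run reports them without erroring out. A minimal sketch, assuming the breakage is limited to the Zilore provider tests seen in this log; the hook and marker are standard pytest APIs, but the "Zilore" match and the conftest.py placement are assumptions:

    # conftest.py sketch -- mark the affected provider tests as xfail while the
    # installed vcrpy still lacks version_string.
    import pytest
    from vcr.stubs import VCRHTTPResponse

    def pytest_collection_modifyitems(config, items):
        if hasattr(VCRHTTPResponse, "version_string"):
            return  # a fixed vcrpy is installed; run everything normally
        marker = pytest.mark.xfail(
            reason="VCRHTTPResponse predates urllib3's version_string attribute"
        )
        for item in items:
            if "Zilore" in item.nodeid:
                item.add_marker(marker)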
_ ZiloreProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _

???
tests/providers/integration_tests.py:327:
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ZiloreProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _

???
tests/providers/integration_tests.py:313:
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ZiloreProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _

???
tests/providers/integration_tests.py:294:
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ZiloreProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _

???
tests/providers/integration_tests.py:541:
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'

/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
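Before rebuilding, it may be worth confirming which side is out of date inside the chroot. A quick check along these lines prints both versions and whether the stub already carries the attribute; the script name is arbitrary and the distribution names urllib3 and vcrpy are assumptions based on the paths in the tracebacks above:

    # check_vcr_urllib3.py -- small diagnostic, run inside the build chroot.
    from importlib.metadata import version
    from vcr.stubs import VCRHTTPResponse

    print("urllib3:", version("urllib3"))
    print("vcrpy:", version("vcrpy"))
    print("VCRHTTPResponse.version_string present:",
          hasattr(VCRHTTPResponse, "version_string"))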
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ZiloreProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = > ??? tests/providers/integration_tests.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/zilore.py:44: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/zilore.py:185: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/v1/domains', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'X-Auth-Key': 'placeholder_auth_key'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. 
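Every Zilore test in this run fails at the same place: the suite replays prerecorded cassettes through vcrpy, and the VCRHTTPResponse object that vcrpy substitutes for the real response does not define the version_string attribute that urllib3's connection pool (2.3 and later) reads when it logs the request. This looks like the known incompatibility between older vcrpy releases and urllib3 >= 2.3, so the usual remedies are updating python-vcrpy to a release that provides version_string, or keeping urllib3 below 2.3 in the test environment. A conftest-level shim along the following lines is a third option; it is only an illustrative sketch (the vcr.stubs import path is assumed from vcrpy 6.x) and is not part of the dns-lexicon sources:

# conftest.py -- hypothetical workaround, not taken from this package.
# Backfill the attribute that urllib3 >= 2.3 reads on every response object.
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):
    # Mirror urllib3's mapping of http.client version codes to display strings.
    VCRHTTPResponse.version_string = property(
        lambda self: {10: "HTTP/1.0", 11: "HTTP/1.1"}.get(
            getattr(self, "version", None), "HTTP/?"
        )
    )

With the property in place, cassette playback satisfies urllib3's debug-logging path and the recorded provider tests can run again.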
_ ZiloreProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _
tests/providers/integration_tests.py:521:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/zilore.py:44: in authenticate
src/lexicon/interfaces.py:163: in _get
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ ZiloreProviderTests.test_provider_when_calling_list_records_after_setting_ttl _
tests/providers/integration_tests.py:211:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/zilore.py:44: in authenticate
src/lexicon/interfaces.py:163: in _get
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ ZiloreProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _
tests/providers/integration_tests.py:507:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/zilore.py:44: in authenticate
src/lexicon/interfaces.py:163: in _get
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ ZiloreProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _
tests/providers/integration_tests.py:199:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/zilore.py:44: in authenticate
src/lexicon/interfaces.py:163: in _get
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ ZiloreProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _
tests/providers/integration_tests.py:185:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/zilore.py:44: in authenticate
src/lexicon/interfaces.py:163: in _get
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
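Because every traceback ends on the same urllib3 log line, a quick check of which library versions landed in the build chroot confirms whether this is the version mismatch described above rather than anything riscv64-specific. A small probe such as the following (the distribution names are the obvious PyPI ones, nothing package-specific) prints the versions involved:

# Print the versions of the libraries involved in the failing code path.
from importlib.metadata import PackageNotFoundError, version

for dist in ("urllib3", "vcrpy", "requests", "dns-lexicon"):
    try:
        print(f"{dist}: {version(dist)}")
    except PackageNotFoundError:
        print(f"{dist}: not installed in this environment")

If urllib3 reports 2.3.x while vcrpy predates the release that added version_string to its response stub, the failures above are expected and unrelated to the port itself.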
_ ZiloreProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _
tests/providers/integration_tests.py:501:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/zilore.py:44: in authenticate
src/lexicon/interfaces.py:163: in _get
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ ZiloreProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _
tests/providers/integration_tests.py:173:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/zilore.py:44: in authenticate
src/lexicon/interfaces.py:163: in _get
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ ZiloreProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _
tests/providers/integration_tests.py:166:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/zilore.py:44: in authenticate
src/lexicon/interfaces.py:163: in _get
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError

_ ZiloreProviderTests.test_provider_when_calling_update_record_should_modify_record _

self = >
    ???
tests/providers/integration_tests.py:240:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
    ???
src/lexicon/_private/providers/zilore.py:44: in authenticate
    ???
src/lexicon/interfaces.py:163: in _get
    return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/zilore.py:185: in _request
    ???
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:703: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.13/site-packages/requests/adapters.py:667: in send
    resp = conn.urlopen(
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
    response = self._make_request(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  conn =  method = 'GET', url = '/dns/v1/domains', body = None
headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'X-Auth-Key': 'placeholder_auth_key'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
timeout = Timeout(connect=None, read=None, total=None), chunked = False
response_conn =  preload_content = False, decode_content = False, enforce_content_length = True

    def _make_request(
        self,
        conn: BaseHTTPConnection,
        method: str,
        url: str,
        body: _TYPE_BODY | None = None,
        headers: typing.Mapping[str, str] | None = None,
        retries: Retry | None = None,
        timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
        chunked: bool = False,
        response_conn: BaseHTTPConnection | None = None,
        preload_content: bool = True,
        decode_content: bool = True,
        enforce_content_length: bool = True,
    ) -> BaseHTTPResponse:
        """
        Perform a request on a given urllib connection object taken from our pool.

        :param conn: a connection from one of our connection pools
        :param method: HTTP request method (such as GET, POST, PUT, etc.)
        :param url: The URL to perform the request on.
        :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object.
        :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc.
If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ZiloreProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = > ??? tests/providers/integration_tests.py:251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/zilore.py:44: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/zilore.py:185: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/v1/domains', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'X-Auth-Key': 'placeholder_auth_key'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. 
:param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. 
except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ZiloreProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/zilore.py:44: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/zilore.py:185: in _request ??? 
/usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/v1/domains', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'X-Auth-Key': 'placeholder_auth_key'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. 
Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ZiloreProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = > ??? tests/providers/integration_tests.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? 
src/lexicon/_private/providers/zilore.py:44: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/zilore.py:185: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/dns/v1/domains', body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'X-Auth-Key': 'placeholder_auth_key'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. 
:param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError ________________ ZonomiProviderTests.test_provider_authenticate ________________ self = > ??? tests/providers/integration_tests.py:106: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/zonomi.py:69: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/zonomi.py:203: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/app/dns/dyndns.jsp?action=QUERY&name=%2A%2A.pcekper.com.ar&type=SOA&api_key=placeholder_auth_token' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. 
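All of these failures, and the Zonomi ones that follow, share one cause: the urllib3 installed in the chroot reads response.version_string when logging the request (visible in the tracebacks), while the cassette-backed VCRHTTPResponse object that the recorded integration tests substitute for a real response does not define that attribute. Below is a minimal sketch of a local test shim, assuming VCRHTTPResponse really is vcrpy's class from vcr.stubs as the error message suggests; this is a hypothetical workaround for illustration, not the fix applied to the package (upstream that would be a vcrpy release providing version_string, or constraining urllib3 to a version that does not log it):

    # conftest.py (hypothetical shim, not part of dns-lexicon):
    # give cassette-backed responses the attribute that urllib3's debug logging reads.
    from vcr.stubs import VCRHTTPResponse

    if not hasattr(VCRHTTPResponse, "version_string"):
        # "HTTP/1.1" is an assumed placeholder; it is only consumed by the log.debug call above.
        VCRHTTPResponse.version_string = "HTTP/1.1"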
________________ ZonomiProviderTests.test_provider_authenticate ________________
tests/providers/integration_tests.py:106:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/zonomi.py:69: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/zonomi.py:203: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ZonomiProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _
tests/providers/integration_tests.py:126:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/zonomi.py:69: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/zonomi.py:203: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ZonomiProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _
tests/providers/integration_tests.py:133:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/zonomi.py:69: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/zonomi.py:203: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ZonomiProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _
tests/providers/integration_tests.py:156:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider
src/lexicon/_private/providers/zonomi.py:69: in authenticate
src/lexicon/interfaces.py:163: in _get
src/lexicon/_private/providers/zonomi.py:203: in _request
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen
E   AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ZonomiProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _ self = > ???
tests/providers/integration_tests.py:147:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/zonomi.py:69: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/zonomi.py:203: in _request ???
[requests/urllib3 frames and the _make_request listing identical to the first full traceback above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
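All of the ZonomiProviderTests failures in this run share the same root cause, visible in the frame above: urllib3's connectionpool.py:551 logs response.version_string after every request, an attribute introduced in recent urllib3 (2.3), while the VCRHTTPResponse object that vcrpy substitutes during cassette playback predates it, so provider authentication aborts before any test logic runs. Newer vcrpy releases appear to add the attribute; failing that, a conftest-level shim along these lines would likely unblock the suite (a hypothetical workaround sketch, not something present in this package):

# conftest.py -- hypothetical compatibility shim, not part of dns-lexicon
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):
    # The recorded cassettes are ordinary HTTP/1.1 exchanges, so a fixed value
    # is enough for urllib3's post-request debug logging to succeed.
    VCRHTTPResponse.version_string = "HTTP/1.1"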
_ ZonomiProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _
self = > ???
tests/providers/integration_tests.py:140:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/zonomi.py:69: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/zonomi.py:203: in _request ???
[requests/urllib3 frames and the _make_request listing identical to the first full traceback above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ZonomiProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _
self = > ???
tests/providers/integration_tests.py:303:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/zonomi.py:69: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/zonomi.py:203: in _request ???
[requests/urllib3 frames and the _make_request listing identical to the first full traceback above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ZonomiProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _
self = > ???
tests/providers/integration_tests.py:327:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/zonomi.py:69: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/zonomi.py:203: in _request ???
[requests/urllib3 frames and the _make_request listing identical to the first full traceback above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ZonomiProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _
self = > ???
tests/providers/integration_tests.py:313:
tests/providers/integration_tests.py:418: in _construct_authenticated_provider ???
src/lexicon/_private/providers/zonomi.py:69: in authenticate ???
src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params)
src/lexicon/_private/providers/zonomi.py:203: in _request ???
[requests/urllib3 frames and the _make_request listing identical to the first full traceback above]
E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string'
/usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
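The _make_request docstring quoted throughout these tracebacks describes the retries and timeout parameters, and the captured call state shows retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) and timeout = Timeout(connect=None, read=None, total=None), i.e. no retries and no timeout, which matches what requests' stock HTTPAdapter hands down when the caller configures nothing. For reference, a short sketch (illustration only, not taken from lexicon or this build) of how an explicit policy would be mounted:

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Retry transient gateway errors a few times with exponential backoff; without
# this, requests passes urllib3 the zero-retry default seen in the state above.
session = requests.Session()
retry = Retry(total=3, backoff_factor=0.5, status_forcelist=[502, 503, 504])
session.mount("https://", HTTPAdapter(max_retries=retry))

# Timeouts are supplied per call as (connect, read) seconds; with no value,
# urllib3 receives the all-None Timeout shown above and may wait indefinitely.
response = session.get("https://example.com/", timeout=(3.05, 10))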
tests/providers/integration_tests.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/zonomi.py:69: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/zonomi.py:203: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/app/dns/dyndns.jsp?action=QUERY&name=%2A%2A.pcekper.com.ar&type=SOA&api_key=placeholder_auth_token' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. 
Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. 
response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ZonomiProviderTests.test_provider_when_calling_list_records_after_setting_ttl _ self = > ??? tests/providers/integration_tests.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/zonomi.py:69: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/zonomi.py:203: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/app/dns/dyndns.jsp?action=QUERY&name=%2A%2A.pcekper.com.ar&type=SOA&api_key=placeholder_auth_token' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. 
Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. 
(read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError _ ZonomiProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _ self = > ??? tests/providers/integration_tests.py:199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/providers/integration_tests.py:418: in _construct_authenticated_provider ??? src/lexicon/_private/providers/zonomi.py:69: in authenticate ??? src/lexicon/interfaces.py:163: in _get return self._request("GET", url, query_params=query_params) src/lexicon/_private/providers/zonomi.py:203: in _request ??? /usr/lib/python3.13/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.13/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) /usr/lib/python3.13/site-packages/requests/adapters.py:667: in send resp = conn.urlopen( /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:787: in urlopen response = self._make_request( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET' url = '/app/dns/dyndns.jsp?action=QUERY&name=%2A%2A.pcekper.com.ar&type=SOA&api_key=placeholder_auth_token' body = None headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
:param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) raise new_e # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. try: conn.request( method, url, body=body, headers=headers, chunked=chunked, preload_content=preload_content, decode_content=decode_content, enforce_content_length=enforce_content_length, ) # We are swallowing BrokenPipeError (errno.EPIPE) since the server is # legitimately able to close the connection after sending a valid response. # With this behaviour, the received response is still readable. except BrokenPipeError: pass except OSError as e: # MacOS/Linux # EPROTOTYPE and ECONNRESET are needed on macOS # https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/ # Condition changed later to emit ECONNRESET instead of only EPROTOTYPE. 
if e.errno != errno.EPROTOTYPE and e.errno != errno.ECONNRESET: raise # Reset the timeout for the recv() on the socket read_timeout = timeout_obj.read_timeout if not conn.is_closed: # In Python 3 socket.py will catch EAGAIN and return None when you # try and read into the file pointer created by http.client, which # instead raises a BadStatusLine exception. Instead of catching # the exception and assuming all BadStatusLine exceptions are read # timeouts, check for a zero timeout before making the request. if read_timeout == 0: raise ReadTimeoutError( self, url, f"Read timed out. (read timeout={read_timeout})" ) conn.timeout = read_timeout # Receive the response from the server try: response = conn.getresponse() except (BaseSSLError, OSError) as e: self._raise_timeout(err=e, url=url, timeout_value=read_timeout) raise # Set properties that are used by the pooling layer. response.retries = retries response._connection = response_conn # type: ignore[attr-defined] response._pool = self # type: ignore[attr-defined] log.debug( '%s://%s:%s "%s %s %s" %s %s', self.scheme, self.host, self.port, method, url, > response.version_string, response.status, response.length_remaining, ) E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ZonomiProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ tests/providers/integration_tests.py:185: E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ZonomiProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _ tests/providers/integration_tests.py:173: E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ZonomiProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ tests/providers/integration_tests.py:166: E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ZonomiProviderTests.test_provider_when_calling_update_record_should_modify_record _ tests/providers/integration_tests.py:240: E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ZonomiProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ tests/providers/integration_tests.py:275: E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
_ ZonomiProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ tests/providers/integration_tests.py:259: E AttributeError: 'VCRHTTPResponse' object has no attribute 'version_string' /usr/lib/python3.13/site-packages/urllib3/connectionpool.py:551: AttributeError
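All of these provider failures share one root cause rather than anything specific to the zonomi module: urllib3 2.3.0 began reading response.version_string in connectionpool._make_request for its debug log line, and the VCRHTTPResponse stub that python-vcrpy substitutes while replaying recorded cassettes predates that attribute, so every replayed request dies before the test gets to assert anything. A minimal compatibility shim, assuming a pytest conftest.py and a constant "HTTP/1.1" value that is only ever interpolated into that log message, could look like the sketch below; upgrading python-vcrpy to a release that supports urllib3 2.3, or constraining urllib3 below 2.3.0 in the build root, would be the cleaner fix.

# conftest.py -- sketch of a stopgap shim, not the fix actually applied to the package:
# give vcrpy's replay stub the attribute that urllib3 >= 2.3 logs in _make_request.
from vcr.stubs import VCRHTTPResponse

if not hasattr(VCRHTTPResponse, "version_string"):
    # A real HTTPResponse derives this from the negotiated HTTP version; a constant
    # is sufficient here because urllib3 only uses it in a log.debug() call.
    VCRHTTPResponse.version_string = "HTTP/1.1"  # assumed placeholder value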
=========================== short test summary info ============================ FAILED tests/providers/test_aliyun.py::AliyunProviderTests::test_provider_authenticate FAILED tests/providers/test_aliyun.py::AliyunProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_aliyun.py::AliyunProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_aliyun.py::AliyunProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_aliyun.py::AliyunProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_aliyun.py::AliyunProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_aliyun.py::AliyunProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_aliyun.py::AliyunProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_aliyun.py::AliyunProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_aliyun.py::AliyunProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_aliyun.py::AliyunProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_aliyun.py::AliyunProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_aliyun.py::AliyunProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED 
tests/providers/test_aliyun.py::AliyunProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_aliyun.py::AliyunProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_aliyun.py::AliyunProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_aliyun.py::AliyunProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_aliyun.py::AliyunProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_aliyun.py::AliyunProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_aliyun.py::AliyunProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_aliyun.py::AliyunProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_aliyun.py::AliyunProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_aliyun.py::AliyunProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_aliyun.py::AliyunProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_aliyun.py::AliyunProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_aurora.py::AuroraProviderTests::test_provider_authenticate FAILED tests/providers/test_aurora.py::AuroraProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_aurora.py::AuroraProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_aurora.py::AuroraProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_aurora.py::AuroraProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_aurora.py::AuroraProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_aurora.py::AuroraProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_aurora.py::AuroraProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_aurora.py::AuroraProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_aurora.py::AuroraProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_aurora.py::AuroraProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_aurora.py::AuroraProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_aurora.py::AuroraProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_aurora.py::AuroraProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED 
tests/providers/test_aurora.py::AuroraProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_aurora.py::AuroraProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_aurora.py::AuroraProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_aurora.py::AuroraProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_aurora.py::AuroraProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_aurora.py::AuroraProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_aurora.py::AuroraProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_aurora.py::AuroraProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_aurora.py::AuroraProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_aurora.py::AuroraProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_aurora.py::AuroraProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_auto.py::AutoProviderTests::test_provider_authenticate FAILED tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_list_records_should_handle_record_sets 
FAILED tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_azure.py::AzureTests::test_provider_authenticate FAILED tests/providers/test_azure.py::AzureTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_azure.py::AzureTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_azure.py::AzureTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_azure.py::AzureTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_azure.py::AzureTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_azure.py::AzureTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_azure.py::AzureTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_azure.py::AzureTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_azure.py::AzureTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_azure.py::AzureTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_azure.py::AzureTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_azure.py::AzureTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_azure.py::AzureTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_azure.py::AzureTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_azure.py::AzureTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_azure.py::AzureTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_azure.py::AzureTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED 
tests/providers/test_azure.py::AzureTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_azure.py::AzureTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_azure.py::AzureTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_azure.py::AzureTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_azure.py::AzureTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_azure.py::AzureTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_azure.py::AzureTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_cloudflare.py::CloudflareProviderTests::test_provider_authenticate FAILED tests/providers/test_cloudflare.py::CloudflareProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_cloudflare.py::CloudflareProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_cloudflare.py::CloudflareProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_cloudflare.py::CloudflareProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_cloudflare.py::CloudflareProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_cloudflare.py::CloudflareProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_cloudflare.py::CloudflareProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_cloudflare.py::CloudflareProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_cloudflare.py::CloudflareProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_cloudflare.py::CloudflareProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_cloudflare.py::CloudflareProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_cloudflare.py::CloudflareProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_cloudflare.py::CloudflareProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_cloudflare.py::CloudflareProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_cloudflare.py::CloudflareProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_cloudflare.py::CloudflareProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_cloudflare.py::CloudflareProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED 
tests/providers/test_cloudflare.py::CloudflareProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_cloudflare.py::CloudflareProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_cloudflare.py::CloudflareProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_cloudflare.py::CloudflareProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_cloudflare.py::CloudflareProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_cloudflare.py::CloudflareProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_cloudflare.py::CloudflareProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_cloudns.py::CloudnsProviderTests::test_provider_authenticate FAILED tests/providers/test_cloudns.py::CloudnsProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_cloudns.py::CloudnsProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_cloudns.py::CloudnsProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_cloudns.py::CloudnsProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_cloudns.py::CloudnsProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_cloudns.py::CloudnsProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_cloudns.py::CloudnsProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_cloudns.py::CloudnsProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_cloudns.py::CloudnsProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_cloudns.py::CloudnsProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_cloudns.py::CloudnsProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_cloudns.py::CloudnsProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_cloudns.py::CloudnsProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_cloudns.py::CloudnsProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_cloudns.py::CloudnsProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_cloudns.py::CloudnsProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_cloudns.py::CloudnsProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED 
tests/providers/test_cloudns.py::CloudnsProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_cloudns.py::CloudnsProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_cloudns.py::CloudnsProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_cloudns.py::CloudnsProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_cloudns.py::CloudnsProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_cloudns.py::CloudnsProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_cloudns.py::CloudnsProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_cloudxns.py::CloudXNSProviderTests::test_provider_authenticate FAILED tests/providers/test_cloudxns.py::CloudXNSProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_cloudxns.py::CloudXNSProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_cloudxns.py::CloudXNSProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_cloudxns.py::CloudXNSProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_cloudxns.py::CloudXNSProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_cloudxns.py::CloudXNSProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_cloudxns.py::CloudXNSProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_cloudxns.py::CloudXNSProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_cloudxns.py::CloudXNSProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_cloudxns.py::CloudXNSProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_cloudxns.py::CloudXNSProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_cloudxns.py::CloudXNSProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_cloudxns.py::CloudXNSProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_cloudxns.py::CloudXNSProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_cloudxns.py::CloudXNSProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_cloudxns.py::CloudXNSProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_cloudxns.py::CloudXNSProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED 
tests/providers/test_cloudxns.py::CloudXNSProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_cloudxns.py::CloudXNSProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_cloudxns.py::CloudXNSProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_cloudxns.py::CloudXNSProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_cloudxns.py::CloudXNSProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_cloudxns.py::CloudXNSProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_conoha.py::ConohaProviderTests::test_provider_authenticate FAILED tests/providers/test_conoha.py::ConohaProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_conoha.py::ConohaProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_conoha.py::ConohaProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_conoha.py::ConohaProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_conoha.py::ConohaProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_conoha.py::ConohaProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_conoha.py::ConohaProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_conoha.py::ConohaProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_conoha.py::ConohaProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_conoha.py::ConohaProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_conoha.py::ConohaProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_conoha.py::ConohaProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_conoha.py::ConohaProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_conoha.py::ConohaProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_conoha.py::ConohaProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_conoha.py::ConohaProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_conoha.py::ConohaProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_conoha.py::ConohaProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_conoha.py::ConohaProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED 
tests/providers/test_conoha.py::ConohaProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_conoha.py::ConohaProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_conoha.py::ConohaProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_conoha.py::ConohaProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_conoha.py::ConohaProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_constellix.py::ConstellixProviderTests::test_provider_authenticate FAILED tests/providers/test_constellix.py::ConstellixProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_constellix.py::ConstellixProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_constellix.py::ConstellixProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_constellix.py::ConstellixProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_constellix.py::ConstellixProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_constellix.py::ConstellixProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_constellix.py::ConstellixProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_constellix.py::ConstellixProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_constellix.py::ConstellixProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_constellix.py::ConstellixProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_constellix.py::ConstellixProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_constellix.py::ConstellixProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_constellix.py::ConstellixProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_constellix.py::ConstellixProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_constellix.py::ConstellixProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_constellix.py::ConstellixProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_constellix.py::ConstellixProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_constellix.py::ConstellixProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_constellix.py::ConstellixProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED 
tests/providers/test_constellix.py::ConstellixProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_constellix.py::ConstellixProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_constellix.py::ConstellixProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_constellix.py::ConstellixProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_constellix.py::ConstellixProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_digitalocean.py::DigitalOceanProviderTests::test_provider_authenticate FAILED tests/providers/test_digitalocean.py::DigitalOceanProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_digitalocean.py::DigitalOceanProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_digitalocean.py::DigitalOceanProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_digitalocean.py::DigitalOceanProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_digitalocean.py::DigitalOceanProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_digitalocean.py::DigitalOceanProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_digitalocean.py::DigitalOceanProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_digitalocean.py::DigitalOceanProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_digitalocean.py::DigitalOceanProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_digitalocean.py::DigitalOceanProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_digitalocean.py::DigitalOceanProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_digitalocean.py::DigitalOceanProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_digitalocean.py::DigitalOceanProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_digitalocean.py::DigitalOceanProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_digitalocean.py::DigitalOceanProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_digitalocean.py::DigitalOceanProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_digitalocean.py::DigitalOceanProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_digitalocean.py::DigitalOceanProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED 
tests/providers/test_digitalocean.py::DigitalOceanProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_digitalocean.py::DigitalOceanProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_digitalocean.py::DigitalOceanProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_digitalocean.py::DigitalOceanProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_digitalocean.py::DigitalOceanProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_dinahosting.py::DinahostingProviderTests::test_provider_authenticate FAILED tests/providers/test_dinahosting.py::DinahostingProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_dinahosting.py::DinahostingProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_dinahosting.py::DinahostingProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_dinahosting.py::DinahostingProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_dinahosting.py::DinahostingProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_dinahosting.py::DinahostingProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_dinahosting.py::DinahostingProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_dinahosting.py::DinahostingProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_dinahosting.py::DinahostingProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_dinahosting.py::DinahostingProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_dinahosting.py::DinahostingProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_dinahosting.py::DinahostingProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_dinahosting.py::DinahostingProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_dinahosting.py::DinahostingProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_dinahosting.py::DinahostingProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_dinahosting.py::DinahostingProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_dinahosting.py::DinahostingProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_dinahosting.py::DinahostingProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED 
tests/providers/test_dinahosting.py::DinahostingProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_dinahosting.py::DinahostingProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_dinahosting.py::DinahostingProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_dinahosting.py::DinahostingProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_dinahosting.py::DinahostingProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_directadmin.py::DirectAdminProviderTests::test_provider_authenticate FAILED tests/providers/test_directadmin.py::DirectAdminProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_directadmin.py::DirectAdminProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_directadmin.py::DirectAdminProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_directadmin.py::DirectAdminProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_directadmin.py::DirectAdminProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_directadmin.py::DirectAdminProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_directadmin.py::DirectAdminProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_directadmin.py::DirectAdminProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_directadmin.py::DirectAdminProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_directadmin.py::DirectAdminProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_directadmin.py::DirectAdminProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_directadmin.py::DirectAdminProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_directadmin.py::DirectAdminProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_directadmin.py::DirectAdminProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_directadmin.py::DirectAdminProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_directadmin.py::DirectAdminProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_directadmin.py::DirectAdminProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_directadmin.py::DirectAdminProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED 
tests/providers/test_directadmin.py::DirectAdminProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_directadmin.py::DirectAdminProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_directadmin.py::DirectAdminProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_directadmin.py::DirectAdminProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_directadmin.py::DirectAdminProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_directadmin.py::DirectAdminProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_dnsimple.py::DnsimpleProviderTests::test_provider_authenticate FAILED tests/providers/test_dnsimple.py::DnsimpleProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_dnsimple.py::DnsimpleProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_dnsimple.py::DnsimpleProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_dnsimple.py::DnsimpleProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_dnsimple.py::DnsimpleProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_dnsimple.py::DnsimpleProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_dnsimple.py::DnsimpleProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_dnsimple.py::DnsimpleProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_dnsimple.py::DnsimpleProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_dnsimple.py::DnsimpleProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_dnsimple.py::DnsimpleProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_dnsimple.py::DnsimpleProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_dnsimple.py::DnsimpleProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_dnsimple.py::DnsimpleProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_dnsimple.py::DnsimpleProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_dnsimple.py::DnsimpleProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_dnsimple.py::DnsimpleProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_dnsimple.py::DnsimpleProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED 
tests/providers/test_dnsimple.py::DnsimpleProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_dnsimple.py::DnsimpleProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_dnsimple.py::DnsimpleProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_dnsimple.py::DnsimpleProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_dnsimple.py::DnsimpleProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_dnsimple.py::DnsimpleProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_dnsmadeeasy.py::DnsmadeeasyProviderTests::test_provider_authenticate FAILED tests/providers/test_dnsmadeeasy.py::DnsmadeeasyProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_dnsmadeeasy.py::DnsmadeeasyProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_dnsmadeeasy.py::DnsmadeeasyProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_dnsmadeeasy.py::DnsmadeeasyProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_dnsmadeeasy.py::DnsmadeeasyProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_dnsmadeeasy.py::DnsmadeeasyProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_dnsmadeeasy.py::DnsmadeeasyProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_dnsmadeeasy.py::DnsmadeeasyProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_dnsmadeeasy.py::DnsmadeeasyProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_dnsmadeeasy.py::DnsmadeeasyProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_dnsmadeeasy.py::DnsmadeeasyProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_dnsmadeeasy.py::DnsmadeeasyProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_dnsmadeeasy.py::DnsmadeeasyProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_dnsmadeeasy.py::DnsmadeeasyProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_dnsmadeeasy.py::DnsmadeeasyProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_dnsmadeeasy.py::DnsmadeeasyProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_dnsmadeeasy.py::DnsmadeeasyProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED 
tests/providers/test_dnsmadeeasy.py::DnsmadeeasyProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_dnsmadeeasy.py::DnsmadeeasyProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_dnsmadeeasy.py::DnsmadeeasyProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_dnsmadeeasy.py::DnsmadeeasyProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_dnsmadeeasy.py::DnsmadeeasyProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_dnsmadeeasy.py::DnsmadeeasyProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_dnspark.py::DnsParkProviderTests::test_provider_authenticate FAILED tests/providers/test_dnspark.py::DnsParkProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_dnspark.py::DnsParkProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_dnspark.py::DnsParkProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_dnspark.py::DnsParkProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_dnspark.py::DnsParkProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_dnspark.py::DnsParkProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_dnspark.py::DnsParkProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_dnspark.py::DnsParkProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_dnspark.py::DnsParkProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_dnspark.py::DnsParkProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_dnspark.py::DnsParkProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_dnspark.py::DnsParkProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_dnspark.py::DnsParkProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_dnspark.py::DnsParkProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_dnspark.py::DnsParkProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_dnspark.py::DnsParkProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_dnspod.py::DnsPodProviderTests::test_provider_authenticate FAILED tests/providers/test_dnspod.py::DnsPodProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_dnspod.py::DnsPodProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED 
tests/providers/test_dnspod.py::DnsPodProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_dnspod.py::DnsPodProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_dnspod.py::DnsPodProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_dnspod.py::DnsPodProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_dnspod.py::DnsPodProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_dnspod.py::DnsPodProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_dnspod.py::DnsPodProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_dnspod.py::DnsPodProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_dnspod.py::DnsPodProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_dnspod.py::DnsPodProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_dnspod.py::DnsPodProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_dnspod.py::DnsPodProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_dnspod.py::DnsPodProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_dnspod.py::DnsPodProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_dnsservices.py::DNSservicesProviderTests::test_provider_authenticate FAILED tests/providers/test_dnsservices.py::DNSservicesProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_dnsservices.py::DNSservicesProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_dnsservices.py::DNSservicesProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_dnsservices.py::DNSservicesProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_dnsservices.py::DNSservicesProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_dnsservices.py::DNSservicesProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_dnsservices.py::DNSservicesProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_dnsservices.py::DNSservicesProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_dnsservices.py::DNSservicesProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_dnsservices.py::DNSservicesProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED 
tests/providers/test_dnsservices.py::DNSservicesProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_dnsservices.py::DNSservicesProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_dnsservices.py::DNSservicesProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_dnsservices.py::DNSservicesProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_dnsservices.py::DNSservicesProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_dnsservices.py::DNSservicesProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_dnsservices.py::DNSservicesProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_dnsservices.py::DNSservicesProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_dnsservices.py::DNSservicesProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_dnsservices.py::DNSservicesProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_dnsservices.py::DNSservicesProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_dnsservices.py::DNSservicesProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_dnsservices.py::DNSservicesProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_dnsservices.py::DNSservicesProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_dreamhost.py::DreamhostProviderTests::test_provider_authenticate FAILED tests/providers/test_dreamhost.py::DreamhostProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_dreamhost.py::DreamhostProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_dreamhost.py::DreamhostProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_dreamhost.py::DreamhostProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_dreamhost.py::DreamhostProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_dreamhost.py::DreamhostProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_dreamhost.py::DreamhostProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_dreamhost.py::DreamhostProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_dreamhost.py::DreamhostProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED 
tests/providers/test_dreamhost.py::DreamhostProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_dreamhost.py::DreamhostProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_dreamhost.py::DreamhostProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_dreamhost.py::DreamhostProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_dreamhost.py::DreamhostProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_dreamhost.py::DreamhostProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_dreamhost.py::DreamhostProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_dreamhost.py::DreamhostProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_dreamhost.py::DreamhostProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_dreamhost.py::DreamhostProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_dreamhost.py::DreamhostProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_dreamhost.py::DreamhostProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_dreamhost.py::DreamhostProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_dreamhost.py::DreamhostProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_duckdns.py::DuckdnsProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_duckdns.py::DuckdnsProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_duckdns.py::DuckdnsProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_duckdns.py::DuckdnsProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_duckdns.py::DuckdnsProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_duckdns.py::DuckdnsProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_duckdns.py::DuckdnsProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_duckdns.py::DuckdnsProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_duckdns.py::DuckdnsProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_duckdns.py::DuckdnsProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED 
tests/providers/test_duckdns.py::DuckdnsProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_duckdns.py::DuckdnsProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_duckdns.py::DuckdnsProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_dynu.py::DynuProviderTests::test_provider_authenticate FAILED tests/providers/test_dynu.py::DynuProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_dynu.py::DynuProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_dynu.py::DynuProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_dynu.py::DynuProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_dynu.py::DynuProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_dynu.py::DynuProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_dynu.py::DynuProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_dynu.py::DynuProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_dynu.py::DynuProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_dynu.py::DynuProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_dynu.py::DynuProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_dynu.py::DynuProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_dynu.py::DynuProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_dynu.py::DynuProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_dynu.py::DynuProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_dynu.py::DynuProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_dynu.py::DynuProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_dynu.py::DynuProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_dynu.py::DynuProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_dynu.py::DynuProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_dynu.py::DynuProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_dynu.py::DynuProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED 
tests/providers/test_dynu.py::DynuProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_dynu.py::DynuProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_easydns.py::EasyDnsProviderTests::test_provider_authenticate FAILED tests/providers/test_easydns.py::EasyDnsProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_easydns.py::EasyDnsProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_easydns.py::EasyDnsProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_easydns.py::EasyDnsProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_easydns.py::EasyDnsProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_easydns.py::EasyDnsProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_easydns.py::EasyDnsProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_easydns.py::EasyDnsProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_easydns.py::EasyDnsProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_easydns.py::EasyDnsProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_easydns.py::EasyDnsProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_easydns.py::EasyDnsProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_easydns.py::EasyDnsProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_easydns.py::EasyDnsProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_easydns.py::EasyDnsProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_easydns.py::EasyDnsProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_easydns.py::EasyDnsProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_easyname.py::EasynameProviderTests::test_provider_authenticate FAILED tests/providers/test_easyname.py::EasynameProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_easyname.py::EasynameProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_easyname.py::EasynameProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_easyname.py::EasynameProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_easyname.py::EasynameProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED 
tests/providers/test_easyname.py::EasynameProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_easyname.py::EasynameProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_easyname.py::EasynameProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_easyname.py::EasynameProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_easyname.py::EasynameProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_easyname.py::EasynameProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_easyname.py::EasynameProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_easyname.py::EasynameProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_easyname.py::EasynameProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_easyname.py::EasynameProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_easyname.py::EasynameProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_easyname.py::EasynameProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_easyname.py::EasynameProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_easyname.py::EasynameProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_easyname.py::EasynameProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_easyname.py::EasynameProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_easyname.py::EasynameProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_easyname.py::EasynameProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_easyname.py::EasynameProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_euserv.py::EUservProviderTests::test_provider_authenticate FAILED tests/providers/test_euserv.py::EUservProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_euserv.py::EUservProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_euserv.py::EUservProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_euserv.py::EUservProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_euserv.py::EUservProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED 
tests/providers/test_euserv.py::EUservProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_euserv.py::EUservProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_euserv.py::EUservProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_euserv.py::EUservProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_euserv.py::EUservProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_euserv.py::EUservProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_euserv.py::EUservProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_euserv.py::EUservProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_euserv.py::EUservProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_euserv.py::EUservProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_euserv.py::EUservProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_euserv.py::EUservProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_euserv.py::EUservProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_euserv.py::EUservProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_euserv.py::EUservProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_euserv.py::EUservProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_euserv.py::EUservProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_euserv.py::EUservProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_euserv.py::EUservProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_exoscale.py::ExoscaleProviderTests::test_provider_authenticate FAILED tests/providers/test_exoscale.py::ExoscaleProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_exoscale.py::ExoscaleProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_exoscale.py::ExoscaleProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_exoscale.py::ExoscaleProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_exoscale.py::ExoscaleProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_exoscale.py::ExoscaleProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED 
tests/providers/test_exoscale.py::ExoscaleProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_exoscale.py::ExoscaleProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_exoscale.py::ExoscaleProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_exoscale.py::ExoscaleProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_exoscale.py::ExoscaleProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_exoscale.py::ExoscaleProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_exoscale.py::ExoscaleProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_exoscale.py::ExoscaleProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_exoscale.py::ExoscaleProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_exoscale.py::ExoscaleProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_exoscale.py::ExoscaleProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_exoscale.py::ExoscaleProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_exoscale.py::ExoscaleProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_exoscale.py::ExoscaleProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_exoscale.py::ExoscaleProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_exoscale.py::ExoscaleProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_exoscale.py::ExoscaleProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_exoscale.py::ExoscaleProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_flexibleengine.py::FlexibleEngineProviderTests::test_provider_authenticate FAILED tests/providers/test_flexibleengine.py::FlexibleEngineProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_flexibleengine.py::FlexibleEngineProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_flexibleengine.py::FlexibleEngineProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_flexibleengine.py::FlexibleEngineProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_flexibleengine.py::FlexibleEngineProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED 
tests/providers/test_flexibleengine.py::FlexibleEngineProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_flexibleengine.py::FlexibleEngineProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_flexibleengine.py::FlexibleEngineProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_flexibleengine.py::FlexibleEngineProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_flexibleengine.py::FlexibleEngineProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_flexibleengine.py::FlexibleEngineProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_flexibleengine.py::FlexibleEngineProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_flexibleengine.py::FlexibleEngineProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_flexibleengine.py::FlexibleEngineProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_flexibleengine.py::FlexibleEngineProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_flexibleengine.py::FlexibleEngineProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_flexibleengine.py::FlexibleEngineProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_flexibleengine.py::FlexibleEngineProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_flexibleengine.py::FlexibleEngineProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_gandi.py::GandiRESTProviderTests::test_provider_authenticate FAILED tests/providers/test_gandi.py::GandiRESTProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_gandi.py::GandiRESTProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_gandi.py::GandiRESTProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_gandi.py::GandiRESTProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_gandi.py::GandiRESTProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_gandi.py::GandiRESTProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_gandi.py::GandiRESTProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_gandi.py::GandiRESTProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_gandi.py::GandiRESTProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED 
tests/providers/test_gandi.py::GandiRESTProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_gandi.py::GandiRESTProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_gandi.py::GandiRESTProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_gandi.py::GandiRESTProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_gandi.py::GandiRESTProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_gandi.py::GandiRESTProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_gandi.py::GandiRESTProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_gandi.py::GandiRESTProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_gandi.py::GandiRESTProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_gandi.py::GandiRESTProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_gandi.py::GandiRESTProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_gandi.py::GandiRESTProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_gandi.py::GandiRESTProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_gandi.py::GandiRESTProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_gandi.py::GandiRESTProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_gehirn.py::GehirnProviderTests::test_provider_authenticate FAILED tests/providers/test_gehirn.py::GehirnProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_gehirn.py::GehirnProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_gehirn.py::GehirnProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_gehirn.py::GehirnProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_gehirn.py::GehirnProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_gehirn.py::GehirnProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_gehirn.py::GehirnProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_gehirn.py::GehirnProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_gehirn.py::GehirnProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_gehirn.py::GehirnProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED 
tests/providers/test_gehirn.py::GehirnProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_gehirn.py::GehirnProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_gehirn.py::GehirnProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_gehirn.py::GehirnProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_gehirn.py::GehirnProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_gehirn.py::GehirnProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_gehirn.py::GehirnProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_gehirn.py::GehirnProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_glesys.py::GlesysProviderTests::test_provider_authenticate FAILED tests/providers/test_glesys.py::GlesysProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_glesys.py::GlesysProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_glesys.py::GlesysProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_glesys.py::GlesysProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_glesys.py::GlesysProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_glesys.py::GlesysProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_glesys.py::GlesysProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_glesys.py::GlesysProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_glesys.py::GlesysProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_glesys.py::GlesysProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_glesys.py::GlesysProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_glesys.py::GlesysProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_glesys.py::GlesysProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_glesys.py::GlesysProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_glesys.py::GlesysProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_glesys.py::GlesysProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_glesys.py::GlesysProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED 
tests/providers/test_godaddy.py::GoDaddyProviderTests::test_provider_authenticate FAILED tests/providers/test_godaddy.py::GoDaddyProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_godaddy.py::GoDaddyProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_godaddy.py::GoDaddyProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_godaddy.py::GoDaddyProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_godaddy.py::GoDaddyProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_godaddy.py::GoDaddyProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_godaddy.py::GoDaddyProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_godaddy.py::GoDaddyProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_godaddy.py::GoDaddyProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_godaddy.py::GoDaddyProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_godaddy.py::GoDaddyProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_godaddy.py::GoDaddyProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_godaddy.py::GoDaddyProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_godaddy.py::GoDaddyProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_godaddy.py::GoDaddyProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_godaddy.py::GoDaddyProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_godaddy.py::GoDaddyProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_godaddy.py::GoDaddyProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_godaddy.py::GoDaddyProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_godaddy.py::GoDaddyProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_godaddy.py::GoDaddyProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_godaddy.py::GoDaddyProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_godaddy.py::GoDaddyProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_godaddy.py::GoDaddyProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_googleclouddns.py::GoogleCloudDnsTests::test_provider_authenticate FAILED 
tests/providers/test_googleclouddns.py::GoogleCloudDnsTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_googleclouddns.py::GoogleCloudDnsTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_googleclouddns.py::GoogleCloudDnsTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_googleclouddns.py::GoogleCloudDnsTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_googleclouddns.py::GoogleCloudDnsTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_googleclouddns.py::GoogleCloudDnsTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_googleclouddns.py::GoogleCloudDnsTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_googleclouddns.py::GoogleCloudDnsTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_googleclouddns.py::GoogleCloudDnsTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_googleclouddns.py::GoogleCloudDnsTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_googleclouddns.py::GoogleCloudDnsTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_googleclouddns.py::GoogleCloudDnsTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_googleclouddns.py::GoogleCloudDnsTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_googleclouddns.py::GoogleCloudDnsTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_googleclouddns.py::GoogleCloudDnsTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_googleclouddns.py::GoogleCloudDnsTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_googleclouddns.py::GoogleCloudDnsTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_googleclouddns.py::GoogleCloudDnsTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_googleclouddns.py::GoogleCloudDnsTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_googleclouddns.py::GoogleCloudDnsTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_googleclouddns.py::GoogleCloudDnsTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_googleclouddns.py::GoogleCloudDnsTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_googleclouddns.py::GoogleCloudDnsTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_googleclouddns.py::GoogleCloudDnsTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED 
tests/providers/test_gransy.py::GransyProviderTests::test_provider_authenticate FAILED tests/providers/test_gransy.py::GransyProviderTests::test_provider_authenticate_with_unmanaged_domain_should_fail FAILED tests/providers/test_gransy.py::GransyProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_gransy.py::GransyProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_gransy.py::GransyProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_gransy.py::GransyProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_gransy.py::GransyProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_gransy.py::GransyProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_gransy.py::GransyProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_gransy.py::GransyProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_gransy.py::GransyProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_gransy.py::GransyProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_gransy.py::GransyProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_gransy.py::GransyProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_gransy.py::GransyProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_gransy.py::GransyProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_gransy.py::GransyProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_gransy.py::GransyProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_gransy.py::GransyProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_gransy.py::GransyProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_gransy.py::GransyProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_gransy.py::GransyProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_gransy.py::GransyProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_gransy.py::GransyProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_gransy.py::GransyProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_gransy.py::GransyProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED 
tests/providers/test_gratisdns.py::GratisDNSProviderTests::test_provider_authenticate FAILED tests/providers/test_gratisdns.py::GratisDNSProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_gratisdns.py::GratisDNSProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_gratisdns.py::GratisDNSProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_gratisdns.py::GratisDNSProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_gratisdns.py::GratisDNSProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_gratisdns.py::GratisDNSProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_gratisdns.py::GratisDNSProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_gratisdns.py::GratisDNSProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_gratisdns.py::GratisDNSProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_gratisdns.py::GratisDNSProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_gratisdns.py::GratisDNSProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_gratisdns.py::GratisDNSProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_gratisdns.py::GratisDNSProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_gratisdns.py::GratisDNSProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_gratisdns.py::GratisDNSProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_gratisdns.py::GratisDNSProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_gratisdns.py::GratisDNSProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_gratisdns.py::GratisDNSProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_gratisdns.py::GratisDNSProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_gratisdns.py::GratisDNSProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_gratisdns.py::GratisDNSProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_gratisdns.py::GratisDNSProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_gratisdns.py::GratisDNSProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_gratisdns.py::GratisDNSProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED 
tests/providers/test_henet.py::HenetProviderTests::test_provider_authenticate FAILED tests/providers/test_henet.py::HenetProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_henet.py::HenetProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_henet.py::HenetProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_henet.py::HenetProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_henet.py::HenetProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_henet.py::HenetProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_henet.py::HenetProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_henet.py::HenetProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_henet.py::HenetProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_henet.py::HenetProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_henet.py::HenetProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_henet.py::HenetProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_henet.py::HenetProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_henet.py::HenetProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_henet.py::HenetProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_henet.py::HenetProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_henet.py::HenetProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_henet.py::HenetProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_henet.py::HenetProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_henet.py::HenetProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_henet.py::HenetProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_henet.py::HenetProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_henet.py::HenetProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_henet.py::HenetProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_hetzner.py::HetznerProviderTests::test_provider_authenticate FAILED tests/providers/test_hetzner.py::HetznerProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED 
tests/providers/test_hetzner.py::HetznerProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_hetzner.py::HetznerProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_hetzner.py::HetznerProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_hetzner.py::HetznerProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_hetzner.py::HetznerProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_hetzner.py::HetznerProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_hetzner.py::HetznerProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_hetzner.py::HetznerProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_hetzner.py::HetznerProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_hetzner.py::HetznerProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_hetzner.py::HetznerProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_hetzner.py::HetznerProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_hetzner.py::HetznerProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_hetzner.py::HetznerProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_hetzner.py::HetznerProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_hetzner.py::HetznerProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_hetzner.py::HetznerProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_hetzner.py::HetznerProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_hetzner.py::HetznerProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_hetzner.py::HetznerProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_hetzner.py::HetznerProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_hetzner.py::HetznerProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_hetzner.py::HetznerProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_hostingde.py::FooProviderTests::test_provider_authenticate FAILED tests/providers/test_hostingde.py::FooProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED 
tests/providers/test_hostingde.py::FooProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_hostingde.py::FooProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_hostingde.py::FooProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_hostingde.py::FooProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_hostingde.py::FooProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_hostingde.py::FooProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_hostingde.py::FooProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_hostingde.py::FooProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_hostingde.py::FooProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_hostingde.py::FooProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_hostingde.py::FooProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_hostingde.py::FooProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_hostingde.py::FooProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_hostingde.py::FooProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_hostingde.py::FooProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_hostingde.py::FooProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_hostingde.py::FooProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_hostingde.py::FooProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_hostingde.py::FooProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_hostingde.py::FooProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_hostingde.py::FooProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_hostingde.py::FooProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_hostingde.py::FooProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_hover.py::HoverProviderTests::test_provider_authenticate FAILED tests/providers/test_hover.py::HoverProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_hover.py::HoverProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED 
tests/providers/test_hover.py::HoverProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_hover.py::HoverProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_hover.py::HoverProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_hover.py::HoverProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_hover.py::HoverProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_hover.py::HoverProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_hover.py::HoverProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_hover.py::HoverProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_hover.py::HoverProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_hover.py::HoverProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_hover.py::HoverProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_hover.py::HoverProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_hover.py::HoverProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_hover.py::HoverProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_hover.py::HoverProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_hover.py::HoverProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_hover.py::HoverProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_hover.py::HoverProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_hover.py::HoverProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_hover.py::HoverProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_hover.py::HoverProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_hover.py::HoverProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_infoblox.py::InfobloxProviderTests::test_provider_authenticate FAILED tests/providers/test_infoblox.py::InfobloxProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_infoblox.py::InfobloxProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_infoblox.py::InfobloxProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED 
tests/providers/test_infoblox.py::InfobloxProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_infoblox.py::InfobloxProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_infoblox.py::InfobloxProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_infoblox.py::InfobloxProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_infoblox.py::InfobloxProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_infoblox.py::InfobloxProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_infoblox.py::InfobloxProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_infoblox.py::InfobloxProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_infoblox.py::InfobloxProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_infoblox.py::InfobloxProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_infoblox.py::InfobloxProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_infoblox.py::InfobloxProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_infoblox.py::InfobloxProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_infoblox.py::InfobloxProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_infoblox.py::InfobloxProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_infoblox.py::InfobloxProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_infoblox.py::InfobloxProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_infoblox.py::InfobloxProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_infoblox.py::InfobloxProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_infoblox.py::InfobloxProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_infoblox.py::InfobloxProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_infomaniak.py::InfomaniakProviderTests::test_provider_authenticate FAILED tests/providers/test_infomaniak.py::InfomaniakProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_infomaniak.py::InfomaniakProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_infomaniak.py::InfomaniakProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED 
tests/providers/test_infomaniak.py::InfomaniakProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_infomaniak.py::InfomaniakProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_infomaniak.py::InfomaniakProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_infomaniak.py::InfomaniakProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_infomaniak.py::InfomaniakProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_infomaniak.py::InfomaniakProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_infomaniak.py::InfomaniakProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_infomaniak.py::InfomaniakProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_infomaniak.py::InfomaniakProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_infomaniak.py::InfomaniakProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_infomaniak.py::InfomaniakProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_infomaniak.py::InfomaniakProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_infomaniak.py::InfomaniakProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_infomaniak.py::InfomaniakProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_infomaniak.py::InfomaniakProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_infomaniak.py::InfomaniakProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_infomaniak.py::InfomaniakProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_infomaniak.py::InfomaniakProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_infomaniak.py::InfomaniakProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_infomaniak.py::InfomaniakProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_infomaniak.py::InfomaniakProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_internetbs.py::InternetbsProviderTests::test_provider_authenticate FAILED tests/providers/test_internetbs.py::InternetbsProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_internetbs.py::InternetbsProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED 
tests/providers/test_internetbs.py::InternetbsProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_internetbs.py::InternetbsProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_internetbs.py::InternetbsProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_internetbs.py::InternetbsProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_internetbs.py::InternetbsProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_internetbs.py::InternetbsProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_internetbs.py::InternetbsProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_internetbs.py::InternetbsProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_internetbs.py::InternetbsProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_internetbs.py::InternetbsProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_internetbs.py::InternetbsProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_internetbs.py::InternetbsProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_internetbs.py::InternetbsProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_internetbs.py::InternetbsProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_internetbs.py::InternetbsProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_internetbs.py::InternetbsProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_internetbs.py::InternetbsProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_internetbs.py::InternetbsProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_internetbs.py::InternetbsProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_internetbs.py::InternetbsProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_internetbs.py::InternetbsProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_internetbs.py::InternetbsProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_joker.py::JokerProviderTests::test_provider_authenticate FAILED tests/providers/test_joker.py::JokerProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_joker.py::JokerProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED 
tests/providers/test_joker.py::JokerProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_joker.py::JokerProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_joker.py::JokerProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_joker.py::JokerProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_joker.py::JokerProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_joker.py::JokerProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_joker.py::JokerProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_joker.py::JokerProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_joker.py::JokerProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_joker.py::JokerProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_joker.py::JokerProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_joker.py::JokerProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_joker.py::JokerProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_joker.py::JokerProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_joker.py::JokerProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_joker.py::JokerProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_joker.py::JokerProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_joker.py::JokerProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_joker.py::JokerProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_joker.py::JokerProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_joker.py::JokerProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_joker.py::JokerProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_linode.py::LinodeProviderTests::test_provider_authenticate FAILED tests/providers/test_linode.py::LinodeProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_linode.py::LinodeProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_linode.py::LinodeProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED 
tests/providers/test_linode.py::LinodeProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_linode.py::LinodeProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_linode.py::LinodeProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_linode.py::LinodeProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_linode.py::LinodeProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_linode.py::LinodeProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_linode.py::LinodeProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_linode.py::LinodeProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_linode.py::LinodeProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_linode.py::LinodeProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_linode.py::LinodeProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_linode.py::LinodeProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_linode.py::LinodeProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_linode.py::LinodeProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_linode.py::LinodeProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_linode.py::LinodeProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_linode.py::LinodeProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_linode.py::LinodeProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_linode.py::LinodeProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_linode.py::LinodeProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_linode4.py::Linode4ProviderTests::test_provider_authenticate FAILED tests/providers/test_linode4.py::Linode4ProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_linode4.py::Linode4ProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_linode4.py::Linode4ProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_linode4.py::Linode4ProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_linode4.py::Linode4ProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED 
tests/providers/test_linode4.py::Linode4ProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_linode4.py::Linode4ProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_linode4.py::Linode4ProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_linode4.py::Linode4ProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_linode4.py::Linode4ProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_linode4.py::Linode4ProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_linode4.py::Linode4ProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_linode4.py::Linode4ProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_linode4.py::Linode4ProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_linode4.py::Linode4ProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_linode4.py::Linode4ProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_linode4.py::Linode4ProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_linode4.py::Linode4ProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_linode4.py::Linode4ProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_linode4.py::Linode4ProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_linode4.py::Linode4ProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_linode4.py::Linode4ProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_linode4.py::Linode4ProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_luadns.py::LuaDNSProviderTests::test_provider_authenticate FAILED tests/providers/test_luadns.py::LuaDNSProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_luadns.py::LuaDNSProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_luadns.py::LuaDNSProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_luadns.py::LuaDNSProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_luadns.py::LuaDNSProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_luadns.py::LuaDNSProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED 
tests/providers/test_luadns.py::LuaDNSProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_luadns.py::LuaDNSProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_luadns.py::LuaDNSProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_luadns.py::LuaDNSProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_luadns.py::LuaDNSProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_luadns.py::LuaDNSProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_luadns.py::LuaDNSProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_luadns.py::LuaDNSProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_luadns.py::LuaDNSProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_luadns.py::LuaDNSProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_luadns.py::LuaDNSProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_luadns.py::LuaDNSProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_luadns.py::LuaDNSProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_luadns.py::LuaDNSProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_luadns.py::LuaDNSProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_luadns.py::LuaDNSProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_memset.py::MemsetProviderTests::test_provider_authenticate FAILED tests/providers/test_memset.py::MemsetProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_memset.py::MemsetProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_memset.py::MemsetProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_memset.py::MemsetProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_memset.py::MemsetProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_memset.py::MemsetProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_memset.py::MemsetProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_memset.py::MemsetProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_memset.py::MemsetProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED 
tests/providers/test_memset.py::MemsetProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_memset.py::MemsetProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_memset.py::MemsetProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_memset.py::MemsetProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_memset.py::MemsetProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_memset.py::MemsetProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_memset.py::MemsetProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_memset.py::MemsetProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_memset.py::MemsetProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_memset.py::MemsetProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_memset.py::MemsetProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_memset.py::MemsetProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_memset.py::MemsetProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_memset.py::MemsetProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_misaka.py::MisakaProviderTests::test_provider_authenticate FAILED tests/providers/test_misaka.py::MisakaProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_misaka.py::MisakaProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_misaka.py::MisakaProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_misaka.py::MisakaProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_misaka.py::MisakaProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_misaka.py::MisakaProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_misaka.py::MisakaProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_misaka.py::MisakaProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_misaka.py::MisakaProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_misaka.py::MisakaProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_misaka.py::MisakaProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED 
tests/providers/test_misaka.py::MisakaProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_misaka.py::MisakaProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_misaka.py::MisakaProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_misaka.py::MisakaProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_misaka.py::MisakaProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_misaka.py::MisakaProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_misaka.py::MisakaProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_misaka.py::MisakaProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_misaka.py::MisakaProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_misaka.py::MisakaProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_misaka.py::MisakaProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_misaka.py::MisakaProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_misaka.py::MisakaProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_mythicbeasts.py::MythicBeastsProviderTests::test_provider_authenticate FAILED tests/providers/test_mythicbeasts.py::MythicBeastsProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_mythicbeasts.py::MythicBeastsProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_mythicbeasts.py::MythicBeastsProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_mythicbeasts.py::MythicBeastsProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_mythicbeasts.py::MythicBeastsProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_mythicbeasts.py::MythicBeastsProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_mythicbeasts.py::MythicBeastsProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_mythicbeasts.py::MythicBeastsProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_mythicbeasts.py::MythicBeastsProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_mythicbeasts.py::MythicBeastsProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_mythicbeasts.py::MythicBeastsProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED 
tests/providers/test_mythicbeasts.py::MythicBeastsProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_mythicbeasts.py::MythicBeastsProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_mythicbeasts.py::MythicBeastsProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_mythicbeasts.py::MythicBeastsProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_mythicbeasts.py::MythicBeastsProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_mythicbeasts.py::MythicBeastsProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_mythicbeasts.py::MythicBeastsProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_mythicbeasts.py::MythicBeastsProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_mythicbeasts.py::MythicBeastsProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_mythicbeasts.py::MythicBeastsProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_mythicbeasts.py::MythicBeastsProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_mythicbeasts.py::MythicBeastsProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_mythicbeasts.py::MythicBeastsProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_authenticate FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_authentication_method FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_create_record_for_MX_with_no_priority FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_create_record_for_MX_with_priority FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_create_record_should_fail_on_http_error FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED 
tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_delete_record_should_pass_if_no_record_to_delete FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_delete_record_with_no_identifier_or_rtype_and_name_should_fail FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_update_record_by_identifier_with_no_other_args_should_pass FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_update_record_filter_by_content_should_pass FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_update_record_should_fail_if_multiple_records_to_update FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_update_record_should_fail_if_no_record_to_update FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_namecom.py::NamecomProviderTests::test_provider_when_calling_update_record_with_no_identifier_or_rtype_and_name_should_fail FAILED tests/providers/test_namesilo.py::NameSiloProviderTests::test_provider_authenticate FAILED 
tests/providers/test_namesilo.py::NameSiloProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_namesilo.py::NameSiloProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_namesilo.py::NameSiloProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_namesilo.py::NameSiloProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_namesilo.py::NameSiloProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_namesilo.py::NameSiloProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_namesilo.py::NameSiloProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_namesilo.py::NameSiloProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_namesilo.py::NameSiloProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_namesilo.py::NameSiloProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_namesilo.py::NameSiloProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_namesilo.py::NameSiloProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_namesilo.py::NameSiloProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_namesilo.py::NameSiloProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_namesilo.py::NameSiloProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_namesilo.py::NameSiloProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_namesilo.py::NameSiloProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_namesilo.py::NameSiloProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_namesilo.py::NameSiloProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_namesilo.py::NameSiloProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_namesilo.py::NameSiloProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_namesilo.py::NameSiloProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_namesilo.py::NameSiloProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_netcup.py::NetcupProviderTests::test_provider_authenticate FAILED tests/providers/test_netcup.py::NetcupProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED 
tests/providers/test_netcup.py::NetcupProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_netcup.py::NetcupProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_netcup.py::NetcupProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_netcup.py::NetcupProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_netcup.py::NetcupProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_netcup.py::NetcupProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_netcup.py::NetcupProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_netcup.py::NetcupProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_netcup.py::NetcupProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_netcup.py::NetcupProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_netcup.py::NetcupProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_netcup.py::NetcupProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_netcup.py::NetcupProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_netcup.py::NetcupProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_netcup.py::NetcupProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_netcup.py::NetcupProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_netcup.py::NetcupProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_netcup.py::NetcupProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_netcup.py::NetcupProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_netcup.py::NetcupProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_netcup.py::NetcupProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_netcup.py::NetcupProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_nfsn.py::NFSNProviderTests::test_provider_authenticate FAILED tests/providers/test_nfsn.py::NFSNProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_nfsn.py::NFSNProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_nfsn.py::NFSNProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED 
tests/providers/test_nfsn.py::NFSNProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_nfsn.py::NFSNProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_nfsn.py::NFSNProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_nfsn.py::NFSNProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_nfsn.py::NFSNProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_nfsn.py::NFSNProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_nfsn.py::NFSNProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_nfsn.py::NFSNProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_nfsn.py::NFSNProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_nfsn.py::NFSNProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_nfsn.py::NFSNProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_nfsn.py::NFSNProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_nfsn.py::NFSNProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_nfsn.py::NFSNProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_nfsn.py::NFSNProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_nfsn.py::NFSNProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_nfsn.py::NFSNProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_nfsn.py::NFSNProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_nfsn.py::NFSNProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_nfsn.py::NFSNProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_nfsn.py::NFSNProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_njalla.py::NjallaProviderTests::test_provider_authenticate FAILED tests/providers/test_njalla.py::NjallaProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_njalla.py::NjallaProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_njalla.py::NjallaProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_njalla.py::NjallaProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED 
tests/providers/test_njalla.py::NjallaProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_njalla.py::NjallaProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_njalla.py::NjallaProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_njalla.py::NjallaProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_njalla.py::NjallaProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_njalla.py::NjallaProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_njalla.py::NjallaProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_njalla.py::NjallaProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_njalla.py::NjallaProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_njalla.py::NjallaProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_njalla.py::NjallaProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_njalla.py::NjallaProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_njalla.py::NjallaProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_njalla.py::NjallaProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_njalla.py::NjallaProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_njalla.py::NjallaProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_njalla.py::NjallaProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_njalla.py::NjallaProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_nsone.py::Ns1ProviderTests::test_provider_authenticate FAILED tests/providers/test_nsone.py::Ns1ProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_nsone.py::Ns1ProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_nsone.py::Ns1ProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_nsone.py::Ns1ProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_nsone.py::Ns1ProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_nsone.py::Ns1ProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_nsone.py::Ns1ProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED 
tests/providers/test_nsone.py::Ns1ProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_nsone.py::Ns1ProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_nsone.py::Ns1ProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_nsone.py::Ns1ProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_nsone.py::Ns1ProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_nsone.py::Ns1ProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_nsone.py::Ns1ProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_nsone.py::Ns1ProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_nsone.py::Ns1ProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_nsone.py::Ns1ProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_nsone.py::Ns1ProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_nsone.py::Ns1ProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_nsone.py::Ns1ProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_nsone.py::Ns1ProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_nsone.py::Ns1ProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_onapp.py::OnappProviderTests::test_provider_authenticate FAILED tests/providers/test_onapp.py::OnappProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_onapp.py::OnappProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_onapp.py::OnappProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_onapp.py::OnappProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_onapp.py::OnappProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_onapp.py::OnappProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_onapp.py::OnappProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_onapp.py::OnappProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_onapp.py::OnappProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_onapp.py::OnappProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED 
tests/providers/test_onapp.py::OnappProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_onapp.py::OnappProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_onapp.py::OnappProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_onapp.py::OnappProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_onapp.py::OnappProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_onapp.py::OnappProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_onapp.py::OnappProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_onapp.py::OnappProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_onapp.py::OnappProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_onapp.py::OnappProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_onapp.py::OnappProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_onapp.py::OnappProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_onapp.py::OnappProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_onapp.py::OnappProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_online.py::OnlineProviderTests::test_provider_authenticate FAILED tests/providers/test_online.py::OnlineProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_online.py::OnlineProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_online.py::OnlineProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_online.py::OnlineProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_online.py::OnlineProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_online.py::OnlineProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_online.py::OnlineProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_online.py::OnlineProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_online.py::OnlineProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_online.py::OnlineProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_online.py::OnlineProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED 
tests/providers/test_online.py::OnlineProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_online.py::OnlineProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_online.py::OnlineProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_online.py::OnlineProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_online.py::OnlineProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_online.py::OnlineProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_online.py::OnlineProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_online.py::OnlineProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_online.py::OnlineProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_online.py::OnlineProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_online.py::OnlineProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_online.py::OnlineProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_ovh.py::OvhProviderTests::test_provider_authenticate FAILED tests/providers/test_ovh.py::OvhProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_ovh.py::OvhProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_ovh.py::OvhProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_ovh.py::OvhProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_ovh.py::OvhProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_ovh.py::OvhProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_ovh.py::OvhProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_ovh.py::OvhProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_ovh.py::OvhProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_ovh.py::OvhProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_ovh.py::OvhProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_ovh.py::OvhProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_ovh.py::OvhProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_ovh.py::OvhProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED 
tests/providers/test_ovh.py::OvhProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_ovh.py::OvhProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_ovh.py::OvhProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_ovh.py::OvhProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_ovh.py::OvhProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_ovh.py::OvhProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_ovh.py::OvhProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_ovh.py::OvhProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_ovh.py::OvhProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_ovh.py::OvhProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_plesk.py::PleskProviderTests::test_provider_authenticate FAILED tests/providers/test_plesk.py::PleskProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_plesk.py::PleskProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_plesk.py::PleskProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_plesk.py::PleskProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_plesk.py::PleskProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_plesk.py::PleskProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_plesk.py::PleskProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_plesk.py::PleskProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_plesk.py::PleskProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_plesk.py::PleskProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_plesk.py::PleskProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_plesk.py::PleskProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_plesk.py::PleskProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_plesk.py::PleskProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_plesk.py::PleskProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED 
tests/providers/test_plesk.py::PleskProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_plesk.py::PleskProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_plesk.py::PleskProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_plesk.py::PleskProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_plesk.py::PleskProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_plesk.py::PleskProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_plesk.py::PleskProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_plesk.py::PleskProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_pointhq.py::PointHqProviderTests::test_provider_authenticate FAILED tests/providers/test_pointhq.py::PointHqProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_pointhq.py::PointHqProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_pointhq.py::PointHqProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_pointhq.py::PointHqProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_pointhq.py::PointHqProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_pointhq.py::PointHqProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_pointhq.py::PointHqProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_pointhq.py::PointHqProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_pointhq.py::PointHqProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_pointhq.py::PointHqProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_pointhq.py::PointHqProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_pointhq.py::PointHqProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_pointhq.py::PointHqProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_pointhq.py::PointHqProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_pointhq.py::PointHqProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_pointhq.py::PointHqProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED 
tests/providers/test_pointhq.py::PointHqProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_pointhq.py::PointHqProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_pointhq.py::PointHqProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_pointhq.py::PointHqProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_pointhq.py::PointHqProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_pointhq.py::PointHqProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_porkbun.py::PorkbunProviderTests::test_provider_authenticate FAILED tests/providers/test_porkbun.py::PorkbunProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_porkbun.py::PorkbunProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_porkbun.py::PorkbunProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_porkbun.py::PorkbunProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_porkbun.py::PorkbunProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_porkbun.py::PorkbunProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_porkbun.py::PorkbunProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_porkbun.py::PorkbunProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_porkbun.py::PorkbunProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_porkbun.py::PorkbunProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_porkbun.py::PorkbunProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_porkbun.py::PorkbunProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_porkbun.py::PorkbunProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_porkbun.py::PorkbunProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_porkbun.py::PorkbunProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_porkbun.py::PorkbunProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_porkbun.py::PorkbunProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_porkbun.py::PorkbunProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED 
tests/providers/test_porkbun.py::PorkbunProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_porkbun.py::PorkbunProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_porkbun.py::PorkbunProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_porkbun.py::PorkbunProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_porkbun.py::PorkbunProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_porkbun.py::PorkbunProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_powerdns.py::PowerdnsProviderTests::test_provider_authenticate FAILED tests/providers/test_powerdns.py::PowerdnsProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_powerdns.py::PowerdnsProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_powerdns.py::PowerdnsProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_powerdns.py::PowerdnsProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_powerdns.py::PowerdnsProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_powerdns.py::PowerdnsProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_powerdns.py::PowerdnsProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_powerdns.py::PowerdnsProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_powerdns.py::PowerdnsProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_powerdns.py::PowerdnsProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_powerdns.py::PowerdnsProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_powerdns.py::PowerdnsProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_powerdns.py::PowerdnsProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_powerdns.py::PowerdnsProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_powerdns.py::PowerdnsProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_powerdns.py::PowerdnsProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_powerdns.py::PowerdnsProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_powerdns.py::PowerdnsProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED 
tests/providers/test_powerdns.py::PowerdnsProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_powerdns.py::PowerdnsProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_powerdns.py::PowerdnsProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_powerdns.py::PowerdnsProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_powerdns.py::PowerdnsProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_rackspace.py::RackspaceProviderTests::test_provider_authenticate FAILED tests/providers/test_rackspace.py::RackspaceProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_rackspace.py::RackspaceProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_rackspace.py::RackspaceProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_rackspace.py::RackspaceProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_rackspace.py::RackspaceProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_rackspace.py::RackspaceProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_rackspace.py::RackspaceProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_rackspace.py::RackspaceProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_rackspace.py::RackspaceProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_rackspace.py::RackspaceProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_rackspace.py::RackspaceProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_rackspace.py::RackspaceProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_rackspace.py::RackspaceProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_rackspace.py::RackspaceProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_rackspace.py::RackspaceProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_rackspace.py::RackspaceProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_rackspace.py::RackspaceProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_rackspace.py::RackspaceProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_rackspace.py::RackspaceProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED 
tests/providers/test_rackspace.py::RackspaceProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_rackspace.py::RackspaceProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_rackspace.py::RackspaceProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_rackspace.py::RackspaceProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_rackspace.py::RackspaceProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_rage4.py::Rage4ProviderTests::test_provider_authenticate FAILED tests/providers/test_rage4.py::Rage4ProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_rage4.py::Rage4ProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_rage4.py::Rage4ProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_rage4.py::Rage4ProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_rage4.py::Rage4ProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_rage4.py::Rage4ProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_rage4.py::Rage4ProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_rage4.py::Rage4ProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_rage4.py::Rage4ProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_rage4.py::Rage4ProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_rage4.py::Rage4ProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_rage4.py::Rage4ProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_rage4.py::Rage4ProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_rage4.py::Rage4ProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_rage4.py::Rage4ProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_rage4.py::Rage4ProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_rage4.py::Rage4ProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_rage4.py::Rage4ProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_rage4.py::Rage4ProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_rage4.py::Rage4ProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED 
tests/providers/test_rage4.py::Rage4ProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_rcodezero.py::RcodezeroProviderTests::test_provider_authenticate FAILED tests/providers/test_rcodezero.py::RcodezeroProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_rcodezero.py::RcodezeroProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_rcodezero.py::RcodezeroProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_rcodezero.py::RcodezeroProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_rcodezero.py::RcodezeroProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_rcodezero.py::RcodezeroProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_rcodezero.py::RcodezeroProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_rcodezero.py::RcodezeroProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_rcodezero.py::RcodezeroProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_rcodezero.py::RcodezeroProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_rcodezero.py::RcodezeroProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_rcodezero.py::RcodezeroProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_rcodezero.py::RcodezeroProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_rcodezero.py::RcodezeroProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_rcodezero.py::RcodezeroProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_rcodezero.py::RcodezeroProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_rcodezero.py::RcodezeroProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_rcodezero.py::RcodezeroProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_rcodezero.py::RcodezeroProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_rcodezero.py::RcodezeroProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_rcodezero.py::RcodezeroProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_rcodezero.py::RcodezeroProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_rcodezero.py::RcodezeroProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED 
tests/providers/test_rcodezero.py::RcodezeroProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_route53.py::Route53ProviderTests::test_provider_authenticate FAILED tests/providers/test_route53.py::Route53ProviderTests::test_provider_authenticate_private_zone_false FAILED tests/providers/test_route53.py::Route53ProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_route53.py::Route53ProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_route53.py::Route53ProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_route53.py::Route53ProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_route53.py::Route53ProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_route53.py::Route53ProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_route53.py::Route53ProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_route53.py::Route53ProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_route53.py::Route53ProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_route53.py::Route53ProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_route53.py::Route53ProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_route53.py::Route53ProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_route53.py::Route53ProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_route53.py::Route53ProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_route53.py::Route53ProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_route53.py::Route53ProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_route53.py::Route53ProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_route53.py::Route53ProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_route53.py::Route53ProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_route53.py::Route53ProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_route53.py::Route53ProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_route53.py::Route53ProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_route53.py::Route53ProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED 
tests/providers/test_route53.py::Route53ProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_safedns.py::SafednsProviderTests::test_provider_authenticate FAILED tests/providers/test_safedns.py::SafednsProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_safedns.py::SafednsProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_safedns.py::SafednsProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_safedns.py::SafednsProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_safedns.py::SafednsProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_safedns.py::SafednsProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_safedns.py::SafednsProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_safedns.py::SafednsProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_safedns.py::SafednsProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_safedns.py::SafednsProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_safedns.py::SafednsProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_safedns.py::SafednsProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_safedns.py::SafednsProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_safedns.py::SafednsProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_safedns.py::SafednsProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_safedns.py::SafednsProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_safedns.py::SafednsProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_safedns.py::SafednsProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_safedns.py::SafednsProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_safedns.py::SafednsProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_safedns.py::SafednsProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_safedns.py::SafednsProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_sakuracloud.py::SakruaCloudProviderTests::test_provider_authenticate FAILED tests/providers/test_sakuracloud.py::SakruaCloudProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED 
tests/providers/test_sakuracloud.py::SakruaCloudProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_sakuracloud.py::SakruaCloudProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_sakuracloud.py::SakruaCloudProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_sakuracloud.py::SakruaCloudProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_sakuracloud.py::SakruaCloudProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_sakuracloud.py::SakruaCloudProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_sakuracloud.py::SakruaCloudProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_sakuracloud.py::SakruaCloudProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_sakuracloud.py::SakruaCloudProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_sakuracloud.py::SakruaCloudProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_sakuracloud.py::SakruaCloudProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_sakuracloud.py::SakruaCloudProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_sakuracloud.py::SakruaCloudProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_sakuracloud.py::SakruaCloudProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_sakuracloud.py::SakruaCloudProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_sakuracloud.py::SakruaCloudProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_softlayer.py::SoftLayerProviderTests::test_provider_authenticate FAILED tests/providers/test_softlayer.py::SoftLayerProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_softlayer.py::SoftLayerProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_softlayer.py::SoftLayerProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_softlayer.py::SoftLayerProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_softlayer.py::SoftLayerProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_softlayer.py::SoftLayerProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_softlayer.py::SoftLayerProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED 
tests/providers/test_softlayer.py::SoftLayerProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_softlayer.py::SoftLayerProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_softlayer.py::SoftLayerProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_softlayer.py::SoftLayerProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_softlayer.py::SoftLayerProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_softlayer.py::SoftLayerProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_softlayer.py::SoftLayerProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_softlayer.py::SoftLayerProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_softlayer.py::SoftLayerProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_softlayer.py::SoftLayerProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_softlayer.py::SoftLayerProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_timeweb.py::TimewebProviderTests::test_provider_authenticate FAILED tests/providers/test_timeweb.py::TimewebProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_timeweb.py::TimewebProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_timeweb.py::TimewebProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_timeweb.py::TimewebProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_timeweb.py::TimewebProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_timeweb.py::TimewebProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_timeweb.py::TimewebProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_timeweb.py::TimewebProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_timeweb.py::TimewebProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_timeweb.py::TimewebProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_timeweb.py::TimewebProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_timeweb.py::TimewebProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_timeweb.py::TimewebProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED 
tests/providers/test_timeweb.py::TimewebProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_timeweb.py::TimewebProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_timeweb.py::TimewebProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_timeweb.py::TimewebProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_timeweb.py::TimewebProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_timeweb.py::TimewebProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_timeweb.py::TimewebProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_timeweb.py::TimewebProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_transip.py::TransipProviderTests::test_provider_authenticate FAILED tests/providers/test_transip.py::TransipProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_transip.py::TransipProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_transip.py::TransipProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_transip.py::TransipProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_transip.py::TransipProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_transip.py::TransipProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_transip.py::TransipProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_transip.py::TransipProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_transip.py::TransipProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_transip.py::TransipProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_transip.py::TransipProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_transip.py::TransipProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_transip.py::TransipProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_transip.py::TransipProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_transip.py::TransipProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_transip.py::TransipProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED 
tests/providers/test_transip.py::TransipProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_transip.py::TransipProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_transip.py::TransipProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_transip.py::TransipProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_transip.py::TransipProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_transip.py::TransipProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_transip.py::TransipProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_transip.py::TransipProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_ultradns.py::UltradnsProviderTests::test_provider_authenticate FAILED tests/providers/test_ultradns.py::UltradnsProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_ultradns.py::UltradnsProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_ultradns.py::UltradnsProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_ultradns.py::UltradnsProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_ultradns.py::UltradnsProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_ultradns.py::UltradnsProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_ultradns.py::UltradnsProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_ultradns.py::UltradnsProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_ultradns.py::UltradnsProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_ultradns.py::UltradnsProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_ultradns.py::UltradnsProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_ultradns.py::UltradnsProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_ultradns.py::UltradnsProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_ultradns.py::UltradnsProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_ultradns.py::UltradnsProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_ultradns.py::UltradnsProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED 
tests/providers/test_ultradns.py::UltradnsProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_ultradns.py::UltradnsProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_ultradns.py::UltradnsProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_ultradns.py::UltradnsProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_ultradns.py::UltradnsProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_ultradns.py::UltradnsProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_ultradns.py::UltradnsProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_ultradns.py::UltradnsProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_vercel.py::VercelProviderTests::test_provider_authenticate FAILED tests/providers/test_vercel.py::VercelProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_vercel.py::VercelProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_vercel.py::VercelProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_vercel.py::VercelProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_vercel.py::VercelProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_vercel.py::VercelProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_vercel.py::VercelProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_vercel.py::VercelProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_vercel.py::VercelProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_vercel.py::VercelProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_vercel.py::VercelProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_vercel.py::VercelProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_vercel.py::VercelProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_vercel.py::VercelProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_vercel.py::VercelProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_vercel.py::VercelProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_vercel.py::VercelProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED 
tests/providers/test_vercel.py::VercelProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_vercel.py::VercelProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_vercel.py::VercelProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_vercel.py::VercelProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_vercel.py::VercelProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_vercel.py::VercelProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_vercel.py::VercelProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_vultr.py::VultrProviderTests::test_provider_authenticate FAILED tests/providers/test_vultr.py::VultrProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_vultr.py::VultrProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_vultr.py::VultrProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_vultr.py::VultrProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_vultr.py::VultrProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_vultr.py::VultrProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_vultr.py::VultrProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_vultr.py::VultrProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_vultr.py::VultrProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_vultr.py::VultrProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_vultr.py::VultrProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_vultr.py::VultrProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_vultr.py::VultrProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_vultr.py::VultrProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_vultr.py::VultrProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_vultr.py::VultrProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_vultr.py::VultrProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_vultr.py::VultrProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED 
tests/providers/test_vultr.py::VultrProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_vultr.py::VultrProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_vultr.py::VultrProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_vultr.py::VultrProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_vultr.py::VultrProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_vultr.py::VultrProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_webgo.py::WebgoProviderTests::test_provider_authenticate FAILED tests/providers/test_webgo.py::WebgoProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_webgo.py::WebgoProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_webgo.py::WebgoProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_webgo.py::WebgoProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_webgo.py::WebgoProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_webgo.py::WebgoProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_webgo.py::WebgoProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_webgo.py::WebgoProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_webgo.py::WebgoProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_webgo.py::WebgoProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_webgo.py::WebgoProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_webgo.py::WebgoProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_webgo.py::WebgoProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_webgo.py::WebgoProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_webgo.py::WebgoProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_webgo.py::WebgoProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_webgo.py::WebgoProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_webgo.py::WebgoProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_webgo.py::WebgoProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED 
tests/providers/test_webgo.py::WebgoProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_webgo.py::WebgoProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_webgo.py::WebgoProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_webgo.py::WebgoProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_webgo.py::WebgoProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_wedos.py::WedosProviderTests::test_provider_authenticate FAILED tests/providers/test_wedos.py::WedosProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_wedos.py::WedosProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_wedos.py::WedosProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_wedos.py::WedosProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_wedos.py::WedosProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_wedos.py::WedosProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_wedos.py::WedosProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_wedos.py::WedosProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_wedos.py::WedosProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_wedos.py::WedosProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_wedos.py::WedosProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_wedos.py::WedosProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_wedos.py::WedosProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_wedos.py::WedosProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_wedos.py::WedosProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_wedos.py::WedosProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_wedos.py::WedosProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_wedos.py::WedosProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_wedos.py::WedosProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_wedos.py::WedosProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED 
tests/providers/test_wedos.py::WedosProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_wedos.py::WedosProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_wedos.py::WedosProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_wedos.py::WedosProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_yandex.py::YandexPDDProviderTests::test_provider_authenticate FAILED tests/providers/test_yandex.py::YandexPDDProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_yandex.py::YandexPDDProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_yandex.py::YandexPDDProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_yandex.py::YandexPDDProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_yandex.py::YandexPDDProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_yandex.py::YandexPDDProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_yandex.py::YandexPDDProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_yandex.py::YandexPDDProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_yandex.py::YandexPDDProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_yandex.py::YandexPDDProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_yandex.py::YandexPDDProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_yandex.py::YandexPDDProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_yandex.py::YandexPDDProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_yandex.py::YandexPDDProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_yandex.py::YandexPDDProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_yandex.py::YandexPDDProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_yandex.py::YandexPDDProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_yandex.py::YandexPDDProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_yandex.py::YandexPDDProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_yandex.py::YandexPDDProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_yandex.py::YandexPDDProviderTests::test_provider_when_calling_update_record_should_modify_record 
FAILED tests/providers/test_yandex.py::YandexPDDProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_yandex.py::YandexPDDProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_yandex.py::YandexPDDProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_yandexcloud.py::YandexCloudProviderTests::test_provider_authenticate FAILED tests/providers/test_yandexcloud.py::YandexCloudProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_yandexcloud.py::YandexCloudProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_yandexcloud.py::YandexCloudProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_yandexcloud.py::YandexCloudProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_yandexcloud.py::YandexCloudProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_yandexcloud.py::YandexCloudProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_yandexcloud.py::YandexCloudProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_yandexcloud.py::YandexCloudProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_yandexcloud.py::YandexCloudProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_yandexcloud.py::YandexCloudProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_yandexcloud.py::YandexCloudProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_yandexcloud.py::YandexCloudProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_yandexcloud.py::YandexCloudProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_yandexcloud.py::YandexCloudProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_yandexcloud.py::YandexCloudProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_yandexcloud.py::YandexCloudProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_yandexcloud.py::YandexCloudProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_yandexcloud.py::YandexCloudProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_yandexcloud.py::YandexCloudProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_yandexcloud.py::YandexCloudProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED 
tests/providers/test_yandexcloud.py::YandexCloudProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_yandexcloud.py::YandexCloudProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_yandexcloud.py::YandexCloudProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_yandexcloud.py::YandexCloudProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_zilore.py::ZiloreProviderTests::test_provider_authenticate FAILED tests/providers/test_zilore.py::ZiloreProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_zilore.py::ZiloreProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_zilore.py::ZiloreProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_zilore.py::ZiloreProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_zilore.py::ZiloreProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_zilore.py::ZiloreProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set FAILED tests/providers/test_zilore.py::ZiloreProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop FAILED tests/providers/test_zilore.py::ZiloreProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_zilore.py::ZiloreProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_zilore.py::ZiloreProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_zilore.py::ZiloreProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_zilore.py::ZiloreProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched FAILED tests/providers/test_zilore.py::ZiloreProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all FAILED tests/providers/test_zilore.py::ZiloreProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_zilore.py::ZiloreProviderTests::test_provider_when_calling_list_records_should_handle_record_sets FAILED tests/providers/test_zilore.py::ZiloreProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_zilore.py::ZiloreProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_zilore.py::ZiloreProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list FAILED tests/providers/test_zilore.py::ZiloreProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_zilore.py::ZiloreProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_zilore.py::ZiloreProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED 
tests/providers/test_zilore.py::ZiloreProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified FAILED tests/providers/test_zilore.py::ZiloreProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_zilore.py::ZiloreProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED tests/providers/test_zonomi.py::ZonomiProviderTests::test_provider_authenticate FAILED tests/providers/test_zonomi.py::ZonomiProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content FAILED tests/providers/test_zonomi.py::ZonomiProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content FAILED tests/providers/test_zonomi.py::ZonomiProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content FAILED tests/providers/test_zonomi.py::ZonomiProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content FAILED tests/providers/test_zonomi.py::ZonomiProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content FAILED tests/providers/test_zonomi.py::ZonomiProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record FAILED tests/providers/test_zonomi.py::ZonomiProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record FAILED tests/providers/test_zonomi.py::ZonomiProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record FAILED tests/providers/test_zonomi.py::ZonomiProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record FAILED tests/providers/test_zonomi.py::ZonomiProviderTests::test_provider_when_calling_list_records_after_setting_ttl FAILED tests/providers/test_zonomi.py::ZonomiProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record FAILED tests/providers/test_zonomi.py::ZonomiProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record FAILED tests/providers/test_zonomi.py::ZonomiProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record FAILED tests/providers/test_zonomi.py::ZonomiProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all FAILED tests/providers/test_zonomi.py::ZonomiProviderTests::test_provider_when_calling_update_record_should_modify_record FAILED tests/providers/test_zonomi.py::ZonomiProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record FAILED tests/providers/test_zonomi.py::ZonomiProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record FAILED
=== 1836 failed, 203 passed, 327 skipped, 56 deselected in 490.47s (0:08:10) ===
==> ERROR: A failure occurred in check().
    Aborting...
==> ERROR: Build failed, check /var/lib/archbuild/extra-riscv64/root16/build
receiving incremental file list
dns-lexicon-3.18.0-2-riscv64-build.log
dns-lexicon-3.18.0-2-riscv64-check.log
sent 62 bytes  received 250,625 bytes  167,124.67 bytes/sec
total size is 18,385,885  speedup is 73.34
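
Note on the abort: makepkg runs the PKGBUILD's check() function inside the chroot, and any non-zero exit status there stops the build, which is exactly what the 1836 pytest failures trigger above. The fragment below is a minimal, hypothetical sketch of what such a check() can look like; it is not the actual dns-lexicon PKGBUILD, and the test path and marker expression are assumptions made only to illustrate how the pytest run and the "56 deselected" count in the summary typically come about.

    # Hypothetical sketch only -- not the packaged dns-lexicon PKGBUILD.
    check() {
      cd lexicon
      # makepkg aborts with "A failure occurred in check()" as soon as this
      # command exits non-zero. The marker expression is an assumption; the
      # real recipe may instead use --deselect or -k to skip tests that need
      # live provider credentials or network access, which a clean chroot
      # build does not have.
      python -m pytest tests -m 'not integration'
    }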